Mar 12 14:47:26 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 12 14:47:26 crc restorecon[4691]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 14:47:26 crc restorecon[4691]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc 
restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc 
restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 
14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc 
restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc 
restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 
crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:26 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 
14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 14:47:27 crc restorecon[4691]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 
crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 14:47:27 crc restorecon[4691]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc 
restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 14:47:27 crc restorecon[4691]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 12 14:47:28 crc kubenswrapper[4869]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 14:47:28 crc kubenswrapper[4869]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 12 14:47:28 crc kubenswrapper[4869]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 14:47:28 crc kubenswrapper[4869]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 12 14:47:28 crc kubenswrapper[4869]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 12 14:47:28 crc kubenswrapper[4869]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.136826 4869 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139682 4869 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139702 4869 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139707 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139711 4869 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139715 4869 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139720 4869 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139724 4869 feature_gate.go:330] unrecognized feature gate: Example Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139728 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139732 4869 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139736 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139741 4869 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139746 4869 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139750 4869 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139753 4869 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139757 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139760 4869 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139764 4869 
feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139767 4869 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139771 4869 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139774 4869 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139777 4869 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139781 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139785 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139789 4869 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139794 4869 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139798 4869 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139802 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139806 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139809 4869 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139813 4869 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139816 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139819 4869 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139823 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139826 4869 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139830 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139834 4869 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139837 4869 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139840 4869 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139844 4869 feature_gate.go:330] 
unrecognized feature gate: VSphereMultiNetworks Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139847 4869 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139850 4869 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139854 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139858 4869 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139861 4869 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139865 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139869 4869 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139874 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139878 4869 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139882 4869 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139886 4869 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139889 4869 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139893 4869 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139896 4869 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139899 4869 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139903 4869 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139906 4869 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139910 4869 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139913 4869 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139917 4869 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139920 4869 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139923 4869 feature_gate.go:330] 
unrecognized feature gate: PlatformOperators Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139927 4869 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139931 4869 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139935 4869 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139939 4869 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139944 4869 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139948 4869 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139952 4869 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139956 4869 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139959 4869 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.139963 4869 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140037 4869 flags.go:64] FLAG: --address="0.0.0.0" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140046 4869 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140053 4869 flags.go:64] FLAG: --anonymous-auth="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140059 4869 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 12 14:47:28 crc kubenswrapper[4869]: 
I0312 14:47:28.140065 4869 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140069 4869 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140075 4869 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140080 4869 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140084 4869 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140089 4869 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140093 4869 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140098 4869 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140102 4869 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140106 4869 flags.go:64] FLAG: --cgroup-root="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140110 4869 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140115 4869 flags.go:64] FLAG: --client-ca-file="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140119 4869 flags.go:64] FLAG: --cloud-config="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140123 4869 flags.go:64] FLAG: --cloud-provider="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140127 4869 flags.go:64] FLAG: --cluster-dns="[]" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140132 4869 flags.go:64] FLAG: --cluster-domain="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140136 4869 flags.go:64] FLAG: 
--config="/etc/kubernetes/kubelet.conf" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140140 4869 flags.go:64] FLAG: --config-dir="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140144 4869 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140150 4869 flags.go:64] FLAG: --container-log-max-files="5" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140157 4869 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140161 4869 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140165 4869 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140169 4869 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140173 4869 flags.go:64] FLAG: --contention-profiling="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140177 4869 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140182 4869 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140186 4869 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140190 4869 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140195 4869 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140199 4869 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140205 4869 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140208 4869 flags.go:64] FLAG: --enable-load-reader="false" Mar 12 
14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140213 4869 flags.go:64] FLAG: --enable-server="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140217 4869 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140222 4869 flags.go:64] FLAG: --event-burst="100" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140227 4869 flags.go:64] FLAG: --event-qps="50" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140231 4869 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140235 4869 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140239 4869 flags.go:64] FLAG: --eviction-hard="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140244 4869 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140248 4869 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140252 4869 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140257 4869 flags.go:64] FLAG: --eviction-soft="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140261 4869 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140265 4869 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140268 4869 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140273 4869 flags.go:64] FLAG: --experimental-mounter-path="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140277 4869 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140281 4869 flags.go:64] FLAG: --fail-swap-on="true" Mar 12 
14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140285 4869 flags.go:64] FLAG: --feature-gates="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140290 4869 flags.go:64] FLAG: --file-check-frequency="20s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140294 4869 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140298 4869 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140302 4869 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140306 4869 flags.go:64] FLAG: --healthz-port="10248" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140310 4869 flags.go:64] FLAG: --help="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140314 4869 flags.go:64] FLAG: --hostname-override="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140318 4869 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140322 4869 flags.go:64] FLAG: --http-check-frequency="20s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140326 4869 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140330 4869 flags.go:64] FLAG: --image-credential-provider-config="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140334 4869 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140339 4869 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140343 4869 flags.go:64] FLAG: --image-service-endpoint="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140347 4869 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140350 4869 flags.go:64] FLAG: --kube-api-burst="100" Mar 12 14:47:28 crc 
kubenswrapper[4869]: I0312 14:47:28.140355 4869 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140359 4869 flags.go:64] FLAG: --kube-api-qps="50" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140363 4869 flags.go:64] FLAG: --kube-reserved="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140368 4869 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140373 4869 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140385 4869 flags.go:64] FLAG: --kubelet-cgroups="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140391 4869 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140397 4869 flags.go:64] FLAG: --lock-file="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140402 4869 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140407 4869 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140412 4869 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140422 4869 flags.go:64] FLAG: --log-json-split-stream="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140428 4869 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140432 4869 flags.go:64] FLAG: --log-text-split-stream="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140436 4869 flags.go:64] FLAG: --logging-format="text" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140440 4869 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140445 4869 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140449 4869 flags.go:64] FLAG: --manifest-url="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140453 4869 flags.go:64] FLAG: --manifest-url-header="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140463 4869 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140467 4869 flags.go:64] FLAG: --max-open-files="1000000" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140472 4869 flags.go:64] FLAG: --max-pods="110" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140476 4869 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140480 4869 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140484 4869 flags.go:64] FLAG: --memory-manager-policy="None" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140488 4869 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140492 4869 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140497 4869 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140502 4869 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140519 4869 flags.go:64] FLAG: --node-status-max-images="50" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140526 4869 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140532 4869 flags.go:64] FLAG: --oom-score-adj="-999" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140554 4869 flags.go:64] FLAG: --pod-cidr="" Mar 12 14:47:28 crc 
kubenswrapper[4869]: I0312 14:47:28.140560 4869 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140567 4869 flags.go:64] FLAG: --pod-manifest-path="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140572 4869 flags.go:64] FLAG: --pod-max-pids="-1" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140577 4869 flags.go:64] FLAG: --pods-per-core="0" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140583 4869 flags.go:64] FLAG: --port="10250" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140589 4869 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140594 4869 flags.go:64] FLAG: --provider-id="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140600 4869 flags.go:64] FLAG: --qos-reserved="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140605 4869 flags.go:64] FLAG: --read-only-port="10255" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140611 4869 flags.go:64] FLAG: --register-node="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140616 4869 flags.go:64] FLAG: --register-schedulable="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140620 4869 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140628 4869 flags.go:64] FLAG: --registry-burst="10" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140632 4869 flags.go:64] FLAG: --registry-qps="5" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140636 4869 flags.go:64] FLAG: --reserved-cpus="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140641 4869 flags.go:64] FLAG: --reserved-memory="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140646 4869 flags.go:64] FLAG: 
--resolv-conf="/etc/resolv.conf" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140651 4869 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140655 4869 flags.go:64] FLAG: --rotate-certificates="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140659 4869 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140663 4869 flags.go:64] FLAG: --runonce="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140667 4869 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140671 4869 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140675 4869 flags.go:64] FLAG: --seccomp-default="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140679 4869 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140683 4869 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140687 4869 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140692 4869 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140696 4869 flags.go:64] FLAG: --storage-driver-password="root" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140700 4869 flags.go:64] FLAG: --storage-driver-secure="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140704 4869 flags.go:64] FLAG: --storage-driver-table="stats" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140708 4869 flags.go:64] FLAG: --storage-driver-user="root" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140712 4869 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: 
I0312 14:47:28.140716 4869 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140720 4869 flags.go:64] FLAG: --system-cgroups="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140724 4869 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140732 4869 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140736 4869 flags.go:64] FLAG: --tls-cert-file="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140740 4869 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140745 4869 flags.go:64] FLAG: --tls-min-version="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140749 4869 flags.go:64] FLAG: --tls-private-key-file="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140753 4869 flags.go:64] FLAG: --topology-manager-policy="none" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140757 4869 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140761 4869 flags.go:64] FLAG: --topology-manager-scope="container" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140765 4869 flags.go:64] FLAG: --v="2" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140771 4869 flags.go:64] FLAG: --version="false" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140777 4869 flags.go:64] FLAG: --vmodule="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140782 4869 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.140786 4869 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140889 4869 feature_gate.go:330] unrecognized feature gate: Example Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 
14:47:28.140899 4869 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140906 4869 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140910 4869 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140914 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140917 4869 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140921 4869 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140925 4869 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140929 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140932 4869 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140936 4869 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140940 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140944 4869 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140947 4869 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140950 4869 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140954 4869 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallGCP Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140957 4869 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140960 4869 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140964 4869 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140967 4869 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140971 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140974 4869 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140977 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140981 4869 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140984 4869 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140988 4869 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140991 4869 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140995 4869 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.140998 4869 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141002 4869 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 14:47:28 crc 
kubenswrapper[4869]: W0312 14:47:28.141005 4869 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141009 4869 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141014 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141018 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141022 4869 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141026 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141029 4869 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141033 4869 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141038 4869 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141042 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141046 4869 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141050 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141055 4869 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141059 4869 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141064 4869 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141067 4869 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141071 4869 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141074 4869 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141078 4869 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141082 4869 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141086 4869 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141089 4869 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141093 4869 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141096 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141100 4869 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141104 4869 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141109 4869 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141113 4869 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141117 4869 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141121 4869 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141128 4869 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141131 4869 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141135 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141138 4869 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141141 4869 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141145 4869 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141150 4869 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141155 4869 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141165 4869 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141171 4869 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.141176 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.141868 4869 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.151705 4869 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.151751 4869 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151848 4869 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151859 4869 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151865 4869 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151870 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151876 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151881 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151886 4869 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151891 4869 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151895 4869 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151900 4869 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151904 4869 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151908 4869 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151912 4869 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151916 4869 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151921 4869 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151926 4869 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151930 4869 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151934 4869 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151940 4869 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151946 4869 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151953 4869 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151958 4869 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151963 4869 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151969 4869 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151974 4869 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151979 4869 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151983 4869 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151988 4869 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151992 4869 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.151998 4869 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152003 4869 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152008 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152012 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152017 4869 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152021 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152025 4869 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152030 4869 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152035 4869 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152039 4869 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152043 4869 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152048 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152053 4869 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152057 4869 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152062 4869 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152067 4869 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152071 4869 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152078 4869 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152084 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152090 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152096 4869 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152103 4869 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152108 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152114 4869 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152122 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152128 4869 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152133 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152139 4869 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152144 4869 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152149 4869 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152153 4869 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152158 4869 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152162 4869 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152167 4869 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152172 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152176 4869 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152181 4869 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152185 4869 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152190 4869 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152195 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152200 4869 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152204 4869 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.152213 4869 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152374 4869 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152384 4869 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152390 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152395 4869 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152399 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152404 4869 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152409 4869 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152414 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152418 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152425 4869 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152433 4869 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152439 4869 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152445 4869 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152451 4869 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152457 4869 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152463 4869 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152468 4869 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152474 4869 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152479 4869 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152484 4869 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152490 4869 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152496 4869 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152501 4869 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152505 4869 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152510 4869 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152515 4869 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152519 4869 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152524 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152528 4869 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152533 4869 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152554 4869 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152559 4869 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152565 4869 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152571 4869 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152577 4869 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152582 4869 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152587 4869 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152592 4869 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152596 4869 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152601 4869 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152606 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152611 4869 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152615 4869 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152620 4869 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152624 4869 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152629 4869 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152634 4869 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152639 4869 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152644 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152648 4869 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152653 4869 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152657 4869 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152662 4869 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152666 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152670 4869 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152675 4869 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152679 4869 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152684 4869 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152688 4869 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152692 4869 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152698 4869 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152704 4869 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152710 4869 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152714 4869 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152719 4869 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152724 4869 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152729 4869 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152733 4869 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152738 4869 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152742 4869 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.152747 4869 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.152756 4869 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.153784 4869 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.157528 4869 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.162836 4869 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.162937 4869 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.164585 4869 server.go:997] "Starting client certificate rotation"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.164614 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.164817 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.191744 4869 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.193299 4869 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.194884 4869 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.203031 4869 log.go:25] "Validated CRI v1 runtime API"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.243144 4869 log.go:25] "Validated CRI v1 image API"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.244366 4869 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.249178 4869 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-12-14-42-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.249204 4869 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.264003 4869 manager.go:217] Machine: {Timestamp:2026-03-12 14:47:28.261650689 +0000 UTC m=+0.546875987 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2ba13367-485d-48d1-abc3-723587dc31cc BootID:0727d113-6abb-4498-952f-5280a3e03df5 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:97:01:30 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:97:01:30 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:42:e6:3b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:42:d3:9f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c3:eb:13 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5e:a7:a9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7a:01:49:7e:42:15 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:86:ba:52:f1:67:11 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.264214 4869 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.264346 4869 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.265438 4869 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.265650 4869 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.265684 4869 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.265855 4869 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.265865 4869 container_manager_linux.go:303] "Creating device plugin manager"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.266224 4869 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.266254 4869 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.266838 4869 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.266926 4869 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.272704 4869 kubelet.go:418] "Attempting to sync node with API server"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.272749 4869 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.272797 4869 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.272815 4869 kubelet.go:324] "Adding apiserver pod source"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.272830 4869 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.278529 4869 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.280378 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.280463 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError"
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.280418 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.280534 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.280419 4869 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.282998 4869 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284206 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284234 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284243 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284252 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284265 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284276 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284284 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284298 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284309 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284318 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284331 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.284347 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.285408 4869 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.285954 4869 server.go:1280] "Started kubelet"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.286914 4869 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.286923 4869 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.287403 4869 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 14:47:28 crc systemd[1]: Started Kubernetes Kubelet.
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.287936 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.290601 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.290660 4869 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.290916 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.290989 4869 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.290999 4869 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.291315 4869 server.go:460] "Adding debug handlers to kubelet server"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.291330 4869 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.291955 4869 factory.go:55] Registering systemd factory
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.291979 4869 factory.go:221] Registration of the systemd container factory successfully
Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.292098 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.292181 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.292327 4869 factory.go:153] Registering CRI-O factory
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.296260 4869 factory.go:221] Registration of the crio container factory successfully
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.296393 4869 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.296421 4869 factory.go:103] Registering Raw factory
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.296437 4869 manager.go:1196] Started watching for new ooms in manager
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.296649 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.298203 4869 manager.go:319] Starting recovery of all containers
Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.297420 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c1f5f7ebeceb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.285920946 +0000 UTC m=+0.571146224,LastTimestamp:2026-03-12 14:47:28.285920946 +0000 UTC m=+0.571146224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305149 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305191 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305202 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token"
seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305211 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305220 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305230 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305239 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305249 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305258 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305266 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305278 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305287 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305296 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305308 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305316 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305327 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305336 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305344 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305353 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305361 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305369 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305377 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305385 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305397 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305405 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305414 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305424 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305435 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305442 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305453 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305463 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305472 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305567 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305578 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305587 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305596 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305604 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305645 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305654 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305664 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305673 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305682 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305691 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305700 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305709 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305718 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305727 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305737 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305748 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305756 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305788 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305799 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305811 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305821 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305831 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305840 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305852 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305864 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305873 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305881 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305889 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305897 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305907 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305915 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305925 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305933 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305942 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305949 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305957 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305965 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305973 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305981 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305990 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.305998 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306005 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306014 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306024 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306033 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306068 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306077 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306086 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306095 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306103 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306111 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state"
pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306119 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306127 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306137 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306147 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306157 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306170 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306179 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306189 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306198 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306207 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306216 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306226 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306236 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306246 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306255 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306264 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306275 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306285 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306294 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306304 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306319 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306332 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306345 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306358 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" 
seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306371 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306382 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306392 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.306402 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308165 4869 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308610 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308644 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308667 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308678 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308690 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308700 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308716 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" 
seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308726 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308739 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308750 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308760 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308770 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308781 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 
14:47:28.308790 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308800 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308809 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308818 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308826 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308836 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308845 4869 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308854 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308863 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308873 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308885 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308893 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308903 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308914 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308925 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308935 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308945 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308955 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308966 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308975 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308984 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.308994 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309004 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309013 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309023 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309033 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309043 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309052 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309061 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309071 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309081 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309091 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309100 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309109 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309119 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309128 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309137 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309146 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309156 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309165 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309174 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309184 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309193 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" 
seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309203 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309214 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309223 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309233 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309242 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309251 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309260 4869 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309272 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309281 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.309973 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310012 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310022 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310034 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310045 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310056 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310066 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310076 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310085 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310094 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310108 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310136 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310145 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310154 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310167 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310176 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310185 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310197 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310207 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310217 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310228 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310237 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" 
seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310246 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310256 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310268 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310278 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310288 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310298 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 
14:47:28.310307 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310319 4869 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310330 4869 reconstruct.go:97] "Volume reconstruction finished" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.310337 4869 reconciler.go:26] "Reconciler: start to sync state" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.313144 4869 manager.go:324] Recovery completed Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.322226 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.324445 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.324488 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.324496 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.325174 4869 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.325245 4869 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.325316 4869 state_mem.go:36] "Initialized new in-memory state 
store" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.332979 4869 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.335171 4869 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.335217 4869 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.335246 4869 kubelet.go:2335] "Starting kubelet main sync loop" Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.335289 4869 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.338060 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.338131 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.338853 4869 policy_none.go:49] "None policy: Start" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.340498 4869 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.340527 4869 state_mem.go:35] "Initializing new in-memory state store" Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.391672 4869 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.394118 4869 manager.go:334] "Starting Device Plugin manager" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.394320 4869 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.394519 4869 server.go:79] "Starting device plugin registration server" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.395042 4869 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.395084 4869 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.395729 4869 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.395863 4869 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.395879 4869 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.404489 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.435345 4869 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.435462 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: 
I0312 14:47:28.436407 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.436457 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.436470 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.436693 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.436800 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.436845 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437672 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437706 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437721 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437749 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437763 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437773 4869 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437861 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.437991 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.438029 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.438853 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.438873 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.438885 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.438929 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.438948 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.438956 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439067 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439249 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439280 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439628 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439664 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439676 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439807 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439831 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439815 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439840 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439912 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.439944 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440477 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440519 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440531 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440662 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440684 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440758 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440790 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.440799 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.441202 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.441226 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.441238 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.495236 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.496534 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.496600 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.496615 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.496640 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.497106 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.497109 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513159 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 
14:47:28.513210 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513242 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513306 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513415 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513478 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513519 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513577 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513609 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513642 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513675 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513752 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513791 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513826 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.513848 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.523028 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c1f5f7ebeceb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 
14:47:28.285920946 +0000 UTC m=+0.571146224,LastTimestamp:2026-03-12 14:47:28.285920946 +0000 UTC m=+0.571146224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615339 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615384 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615402 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615417 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615434 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615468 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615487 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615504 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615519 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615535 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615566 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615582 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615597 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615614 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615627 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615741 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615730 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615804 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615851 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615837 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615816 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: 
I0312 14:47:28.615799 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615911 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615943 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615950 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615965 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615983 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.615993 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.616001 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.616103 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.697984 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.699677 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.699717 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.699732 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.699758 4869 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 12 14:47:28 crc kubenswrapper[4869]: E0312 14:47:28.700239 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.765395 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.787643 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.803726 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.811406 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-be2689a53ab40295f4cd53e30ce7abd1a449b5ef3cfd3c0e54c87810f45c7ddd WatchSource:0}: Error finding container be2689a53ab40295f4cd53e30ce7abd1a449b5ef3cfd3c0e54c87810f45c7ddd: Status 404 returned error can't find the container with id be2689a53ab40295f4cd53e30ce7abd1a449b5ef3cfd3c0e54c87810f45c7ddd Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.822152 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-03e4c5f6f54c4a1a0f036f718625d0426b90e9f5ea5f4d76bbd85da50f941602 WatchSource:0}: Error finding container 03e4c5f6f54c4a1a0f036f718625d0426b90e9f5ea5f4d76bbd85da50f941602: Status 404 returned error can't find the container with id 03e4c5f6f54c4a1a0f036f718625d0426b90e9f5ea5f4d76bbd85da50f941602 Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 
14:47:28.826062 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.828967 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2492376b135f656b30c7d514bea65a6dc4278ea8d6d774eb64a8f0dd1c6688b0 WatchSource:0}: Error finding container 2492376b135f656b30c7d514bea65a6dc4278ea8d6d774eb64a8f0dd1c6688b0: Status 404 returned error can't find the container with id 2492376b135f656b30c7d514bea65a6dc4278ea8d6d774eb64a8f0dd1c6688b0 Mar 12 14:47:28 crc kubenswrapper[4869]: I0312 14:47:28.832916 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.847675 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9e7afd5b62ea50d473d0c5efe0de151985e203ef964b70b3df3922ebd8df8a6d WatchSource:0}: Error finding container 9e7afd5b62ea50d473d0c5efe0de151985e203ef964b70b3df3922ebd8df8a6d: Status 404 returned error can't find the container with id 9e7afd5b62ea50d473d0c5efe0de151985e203ef964b70b3df3922ebd8df8a6d Mar 12 14:47:28 crc kubenswrapper[4869]: W0312 14:47:28.860337 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b3733638bb5f2aade365f4cbe296b30cdbfa99dbf43f9c19a02a093740315185 WatchSource:0}: Error finding container b3733638bb5f2aade365f4cbe296b30cdbfa99dbf43f9c19a02a093740315185: Status 404 returned error can't find the container with id b3733638bb5f2aade365f4cbe296b30cdbfa99dbf43f9c19a02a093740315185 Mar 12 14:47:28 crc 
kubenswrapper[4869]: E0312 14:47:28.898722 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.100505 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.101603 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.101653 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.101663 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.101687 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:47:29 crc kubenswrapper[4869]: E0312 14:47:29.102125 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.289393 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.342812 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"be2689a53ab40295f4cd53e30ce7abd1a449b5ef3cfd3c0e54c87810f45c7ddd"} Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.343935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b3733638bb5f2aade365f4cbe296b30cdbfa99dbf43f9c19a02a093740315185"} Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.344727 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9e7afd5b62ea50d473d0c5efe0de151985e203ef964b70b3df3922ebd8df8a6d"} Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.345653 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2492376b135f656b30c7d514bea65a6dc4278ea8d6d774eb64a8f0dd1c6688b0"} Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.346448 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"03e4c5f6f54c4a1a0f036f718625d0426b90e9f5ea5f4d76bbd85da50f941602"} Mar 12 14:47:29 crc kubenswrapper[4869]: W0312 14:47:29.562294 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:29 crc kubenswrapper[4869]: E0312 14:47:29.562696 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 12 14:47:29 crc kubenswrapper[4869]: W0312 14:47:29.574559 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:29 crc kubenswrapper[4869]: E0312 14:47:29.574650 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 12 14:47:29 crc kubenswrapper[4869]: E0312 14:47:29.699999 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Mar 12 14:47:29 crc kubenswrapper[4869]: W0312 14:47:29.735207 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:29 crc kubenswrapper[4869]: E0312 14:47:29.735289 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 12 
14:47:29 crc kubenswrapper[4869]: W0312 14:47:29.817133 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:29 crc kubenswrapper[4869]: E0312 14:47:29.817210 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.902561 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.904087 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.904115 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.904125 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:29 crc kubenswrapper[4869]: I0312 14:47:29.904150 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:47:29 crc kubenswrapper[4869]: E0312 14:47:29.904497 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.289698 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.291823 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 14:47:30 crc kubenswrapper[4869]: E0312 14:47:30.292514 4869 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.351660 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2de20daa514f4a02e2e39870688912008c083747f381639a5877021403824edb"} Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.351708 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8522fd1ace1e915d1cf8b464a76990ea852b7969113559f840bdf29336a129fe"} Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.351721 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"894553568d4c9460c8eae3f6e38e7a070e5cd64faa344af8455c625368bf6ed7"} Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.351733 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4ee62258679276360176bb5fc50c9215884f5d5e9a291139f21b5e0216f5e6f1"} Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.351721 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.352644 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.352682 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.352691 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.353914 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3" exitCode=0 Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.354040 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.354059 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3"} Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.354706 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.354733 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.354741 4869 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.355402 4869 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c7ef0128b62f970dba438bdc2e6d12afbd4255856a3e75e536104abb0377d0d7" exitCode=0 Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.355453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c7ef0128b62f970dba438bdc2e6d12afbd4255856a3e75e536104abb0377d0d7"} Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.355562 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.356084 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.356332 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.356360 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.356372 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.356824 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.356872 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.356886 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:30 
crc kubenswrapper[4869]: I0312 14:47:30.357566 4869 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3cced8a128f26c6a25a96960ecfb0afc98b151a3927e2f70870db4bab02f464b" exitCode=0 Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.357623 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.357601 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3cced8a128f26c6a25a96960ecfb0afc98b151a3927e2f70870db4bab02f464b"} Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.358192 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.358212 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.358223 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.359201 4869 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="61862c4b14b53a9473d21253db6cdb86d32a0a92ece020f7c34d4fd742c6001b" exitCode=0 Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.359306 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.359231 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"61862c4b14b53a9473d21253db6cdb86d32a0a92ece020f7c34d4fd742c6001b"} Mar 12 14:47:30 
crc kubenswrapper[4869]: I0312 14:47:30.360025 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.360049 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.360058 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:30 crc kubenswrapper[4869]: I0312 14:47:30.649159 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.289719 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:31 crc kubenswrapper[4869]: E0312 14:47:31.301501 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.366127 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.366173 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e"} Mar 12 14:47:31 crc 
kubenswrapper[4869]: I0312 14:47:31.366188 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.366200 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.368217 4869 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7889e50b5da8eee982eec75372b86dc3dacc3e329cdaab89b29d22a390a2ebb7" exitCode=0 Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.368323 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7889e50b5da8eee982eec75372b86dc3dacc3e329cdaab89b29d22a390a2ebb7"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.368481 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.369595 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.369635 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.369650 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.371795 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"47d726b37435389b1bf4f3c2f6134bd1213d4ec3db055a240565f81865922d66"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.371877 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.373491 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.373520 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.373532 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.375852 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c561dbf9bdcc677d4b21949bcde116558b54eb57a7ac57ead28f046c6a069cd5"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.375884 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.376018 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.375883 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b66cad3ebf294979184567195d7259859a29ca463cac923b5c525ce999254cf9"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.376270 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"991feb5c44b35dff11bf3dc517ef5c15b8f4991894d155b83b80ef50be1c4ef8"} Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.377124 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.377151 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.377162 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.377254 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.377306 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.377320 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:31 crc kubenswrapper[4869]: W0312 14:47:31.456959 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 14:47:31 crc kubenswrapper[4869]: E0312 14:47:31.457044 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 12 14:47:31 crc 
kubenswrapper[4869]: I0312 14:47:31.505351 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.506902 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.506953 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.506967 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:31 crc kubenswrapper[4869]: I0312 14:47:31.507000 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:47:31 crc kubenswrapper[4869]: E0312 14:47:31.507639 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.383012 4869 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="41e6b129b618c81ef7ba90b18a790486340203cbdc44a4fb7855c4604f0a8422" exitCode=0 Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.383121 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"41e6b129b618c81ef7ba90b18a790486340203cbdc44a4fb7855c4604f0a8422"} Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.383181 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.384520 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 
14:47:32.384557 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.384565 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.386946 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.387034 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.387147 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.387186 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"22fa42a48ae5e1f234df6ec4d02775f7dc0b9eeffeb8d8747525268597c7b0fd"} Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.387264 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.387564 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388275 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388321 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388330 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:32 crc 
kubenswrapper[4869]: I0312 14:47:32.388365 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388431 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388366 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388481 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388489 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388499 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388365 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388553 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.388561 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.736762 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:32 crc kubenswrapper[4869]: I0312 14:47:32.743020 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 
14:47:33.395512 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc8747b2086cda75f1a5049c649143ac2f1a1c41116879057ae9baff8a640a48"} Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395577 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3164a5c7766869bf1d0099fe7843be1cccf9b8f93f950517dfe949d35fc03035"} Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395588 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"654518fe337611c4a840070ba48c2cf1f4f133bd5f20343b9c22ed7e06ca2166"} Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395597 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7cc107b25f149da025d41ad9499074626a799ca93dca94a86a7b872e66679b1"} Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395607 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c925c5311e6becdc9c5bcc05f6476d5cf80d4840e9b66fe5bea9beb5c6fbb109"} Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395630 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395652 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395702 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395745 4869 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.395753 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396826 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396855 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396863 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396922 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396933 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396950 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396955 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396962 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396989 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.397007 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:33 crc kubenswrapper[4869]: 
I0312 14:47:33.397016 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.396966 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:33 crc kubenswrapper[4869]: I0312 14:47:33.853742 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.168651 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.185801 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.397434 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.397460 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.397559 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.397663 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.398582 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.398602 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.398631 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.398859 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.398818 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.399002 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.400712 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.400747 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.400759 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.518177 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.541391 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.708488 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.709986 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.710018 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.710028 4869 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:34 crc kubenswrapper[4869]: I0312 14:47:34.710050 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.398986 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.399069 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.400210 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.400269 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.400287 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.400362 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.400418 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.400455 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.510526 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.510823 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.512403 4869 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.512447 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.512457 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.720062 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:35 crc kubenswrapper[4869]: I0312 14:47:35.852111 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.400923 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.400993 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.400928 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402047 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402137 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402202 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402083 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402305 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402318 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402139 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402351 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.402360 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.854673 4869 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 14:47:36 crc kubenswrapper[4869]: I0312 14:47:36.855068 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 14:47:38 crc kubenswrapper[4869]: E0312 14:47:38.404616 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:47:40 crc kubenswrapper[4869]: I0312 14:47:40.653518 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:40 crc kubenswrapper[4869]: I0312 14:47:40.653657 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:40 crc kubenswrapper[4869]: I0312 14:47:40.654601 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:40 crc kubenswrapper[4869]: I0312 14:47:40.654644 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:40 crc kubenswrapper[4869]: I0312 14:47:40.654654 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:42 crc kubenswrapper[4869]: I0312 14:47:42.290056 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 12 14:47:42 crc kubenswrapper[4869]: W0312 14:47:42.479172 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 12 14:47:42 crc kubenswrapper[4869]: I0312 14:47:42.479281 4869 trace.go:236] Trace[1989226673]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 14:47:32.477) (total time: 10001ms): Mar 12 14:47:42 crc kubenswrapper[4869]: Trace[1989226673]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:47:42.479) Mar 12 14:47:42 crc kubenswrapper[4869]: Trace[1989226673]: [10.001664596s] [10.001664596s] END Mar 12 14:47:42 crc kubenswrapper[4869]: E0312 14:47:42.479322 4869 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 12 14:47:42 crc kubenswrapper[4869]: W0312 14:47:42.600677 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 12 14:47:42 crc kubenswrapper[4869]: I0312 14:47:42.600786 4869 trace.go:236] Trace[28335153]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 14:47:32.599) (total time: 10001ms): Mar 12 14:47:42 crc kubenswrapper[4869]: Trace[28335153]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:47:42.600) Mar 12 14:47:42 crc kubenswrapper[4869]: Trace[28335153]: [10.001370938s] [10.001370938s] END Mar 12 14:47:42 crc kubenswrapper[4869]: E0312 14:47:42.600814 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 12 14:47:42 crc kubenswrapper[4869]: W0312 14:47:42.686856 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 12 14:47:42 crc kubenswrapper[4869]: I0312 14:47:42.686993 4869 trace.go:236] Trace[1243355959]: "Reflector 
ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 14:47:32.685) (total time: 10001ms): Mar 12 14:47:42 crc kubenswrapper[4869]: Trace[1243355959]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:47:42.686) Mar 12 14:47:42 crc kubenswrapper[4869]: Trace[1243355959]: [10.00179361s] [10.00179361s] END Mar 12 14:47:42 crc kubenswrapper[4869]: E0312 14:47:42.687023 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 12 14:47:43 crc kubenswrapper[4869]: E0312 14:47:43.026010 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:43Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 14:47:43 crc kubenswrapper[4869]: W0312 14:47:43.027398 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:43Z is after 2026-02-23T05:33:13Z Mar 12 14:47:43 crc kubenswrapper[4869]: E0312 14:47:43.027491 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-12T14:47:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.031932 4869 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.032276 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 12 14:47:43 crc kubenswrapper[4869]: E0312 14:47:43.035140 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:43Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c1f5f7ebeceb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.285920946 +0000 UTC m=+0.571146224,LastTimestamp:2026-03-12 14:47:28.285920946 +0000 UTC m=+0.571146224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.037705 4869 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.037764 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 12 14:47:43 crc kubenswrapper[4869]: E0312 14:47:43.038197 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:43Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 12 14:47:43 crc kubenswrapper[4869]: E0312 14:47:43.043166 4869 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.291242 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:43Z is after 2026-02-23T05:33:13Z 
Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.416311 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.417604 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="22fa42a48ae5e1f234df6ec4d02775f7dc0b9eeffeb8d8747525268597c7b0fd" exitCode=255 Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.417640 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"22fa42a48ae5e1f234df6ec4d02775f7dc0b9eeffeb8d8747525268597c7b0fd"} Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.417759 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.418384 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.418407 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.418415 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:43 crc kubenswrapper[4869]: I0312 14:47:43.418844 4869 scope.go:117] "RemoveContainer" containerID="22fa42a48ae5e1f234df6ec4d02775f7dc0b9eeffeb8d8747525268597c7b0fd" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.290649 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T14:47:44Z is after 2026-02-23T05:33:13Z Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.421396 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.422147 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.424716 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" exitCode=255 Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.424768 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd"} Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.424806 4869 scope.go:117] "RemoveContainer" containerID="22fa42a48ae5e1f234df6ec4d02775f7dc0b9eeffeb8d8747525268597c7b0fd" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.424970 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.425941 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.425987 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.426003 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:44 
crc kubenswrapper[4869]: I0312 14:47:44.426747 4869 scope.go:117] "RemoveContainer" containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" Mar 12 14:47:44 crc kubenswrapper[4869]: E0312 14:47:44.427015 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.542752 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.542878 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.543908 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.543936 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.543945 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:44 crc kubenswrapper[4869]: I0312 14:47:44.553274 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.293399 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-12T14:47:45Z is after 2026-02-23T05:33:13Z Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.428520 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.430641 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.431339 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.431369 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.431377 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.510893 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.511110 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.512381 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.512423 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.512438 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.513123 4869 scope.go:117] "RemoveContainer" 
containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" Mar 12 14:47:45 crc kubenswrapper[4869]: E0312 14:47:45.513360 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:47:45 crc kubenswrapper[4869]: I0312 14:47:45.858640 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.292910 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:46Z is after 2026-02-23T05:33:13Z Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.432653 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.433595 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.433631 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.433640 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.434127 4869 scope.go:117] "RemoveContainer" containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" Mar 12 
14:47:46 crc kubenswrapper[4869]: E0312 14:47:46.434303 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.437024 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.854949 4869 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 14:47:46 crc kubenswrapper[4869]: I0312 14:47:46.855076 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 14:47:47 crc kubenswrapper[4869]: I0312 14:47:47.293723 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:47Z is after 2026-02-23T05:33:13Z Mar 12 14:47:47 crc kubenswrapper[4869]: I0312 14:47:47.435844 4869 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:47 crc kubenswrapper[4869]: I0312 14:47:47.437117 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:47 crc kubenswrapper[4869]: I0312 14:47:47.437187 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:47 crc kubenswrapper[4869]: I0312 14:47:47.437205 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:47 crc kubenswrapper[4869]: I0312 14:47:47.437881 4869 scope.go:117] "RemoveContainer" containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" Mar 12 14:47:47 crc kubenswrapper[4869]: E0312 14:47:47.438139 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:47:47 crc kubenswrapper[4869]: W0312 14:47:47.649663 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:47Z is after 2026-02-23T05:33:13Z Mar 12 14:47:47 crc kubenswrapper[4869]: E0312 14:47:47.649794 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 14:47:47 crc kubenswrapper[4869]: W0312 14:47:47.670101 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:47Z is after 2026-02-23T05:33:13Z Mar 12 14:47:47 crc kubenswrapper[4869]: E0312 14:47:47.670220 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 14:47:47 crc kubenswrapper[4869]: W0312 14:47:47.857370 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:47Z is after 2026-02-23T05:33:13Z Mar 12 14:47:47 crc kubenswrapper[4869]: E0312 14:47:47.857470 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-12T14:47:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 14:47:48 crc kubenswrapper[4869]: I0312 14:47:48.293367 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:48Z is after 2026-02-23T05:33:13Z Mar 12 14:47:48 crc kubenswrapper[4869]: I0312 14:47:48.326855 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:47:48 crc kubenswrapper[4869]: E0312 14:47:48.404712 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:47:48 crc kubenswrapper[4869]: I0312 14:47:48.437442 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:48 crc kubenswrapper[4869]: I0312 14:47:48.438369 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:48 crc kubenswrapper[4869]: I0312 14:47:48.438405 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:48 crc kubenswrapper[4869]: I0312 14:47:48.438414 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:48 crc kubenswrapper[4869]: I0312 14:47:48.438956 4869 scope.go:117] "RemoveContainer" containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" Mar 12 14:47:48 crc kubenswrapper[4869]: E0312 14:47:48.439140 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:47:49 crc kubenswrapper[4869]: I0312 14:47:49.291953 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:49Z is after 2026-02-23T05:33:13Z Mar 12 14:47:49 crc kubenswrapper[4869]: I0312 14:47:49.426368 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:49 crc kubenswrapper[4869]: I0312 14:47:49.428054 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:49 crc kubenswrapper[4869]: I0312 14:47:49.428103 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:49 crc kubenswrapper[4869]: I0312 14:47:49.428115 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:49 crc kubenswrapper[4869]: I0312 14:47:49.428146 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:47:49 crc kubenswrapper[4869]: E0312 14:47:49.430925 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:49Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 14:47:49 crc kubenswrapper[4869]: E0312 14:47:49.441257 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T14:47:49Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 14:47:50 crc kubenswrapper[4869]: I0312 14:47:50.292526 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:51 crc kubenswrapper[4869]: I0312 14:47:51.095590 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 14:47:51 crc kubenswrapper[4869]: I0312 14:47:51.113818 4869 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 12 14:47:51 crc kubenswrapper[4869]: I0312 14:47:51.293356 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:52 crc kubenswrapper[4869]: I0312 14:47:52.295394 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.039678 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f7ebeceb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.285920946 +0000 UTC m=+0.571146224,LastTimestamp:2026-03-12 14:47:28.285920946 +0000 UTC m=+0.571146224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.047943 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.056656 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.060983 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b7f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,LastTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.066888 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f85728615 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.398362133 +0000 UTC m=+0.683587411,LastTimestamp:2026-03-12 14:47:28.398362133 +0000 UTC 
m=+0.683587411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.072129 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b1a82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.436448876 +0000 UTC m=+0.721674144,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.076107 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b6079\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.436465837 +0000 UTC m=+0.721691115,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.081189 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b7f91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b7f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,LastTimestamp:2026-03-12 14:47:28.436476047 +0000 UTC m=+0.721701325,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.085951 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b1a82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.437697402 +0000 UTC m=+0.722922700,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.089968 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b6079\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.437716113 +0000 UTC m=+0.722941411,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.094744 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b7f91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b7f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,LastTimestamp:2026-03-12 14:47:28.437731333 +0000 UTC m=+0.722956631,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.099186 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b1a82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.437758784 +0000 UTC m=+0.722984062,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.104024 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b6079\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.437768494 +0000 UTC m=+0.722993772,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.108005 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b7f91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b7f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,LastTimestamp:2026-03-12 14:47:28.437777535 +0000 UTC m=+0.723002813,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.112305 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b1a82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.438867276 +0000 UTC m=+0.724092554,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.116643 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b6079\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.438879826 +0000 UTC m=+0.724105104,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.123154 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b7f91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b7f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,LastTimestamp:2026-03-12 14:47:28.438891427 +0000 UTC m=+0.724116705,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.129086 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b1a82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC 
m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.438943828 +0000 UTC m=+0.724169106,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.130272 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b6079\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.438953888 +0000 UTC m=+0.724179166,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.133967 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b7f91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b7f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,LastTimestamp:2026-03-12 14:47:28.438961229 +0000 UTC m=+0.724186507,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.137945 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b1a82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.439649968 +0000 UTC m=+0.724875266,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.143287 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b6079\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.439672199 +0000 UTC m=+0.724897487,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.147467 4869 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b7f91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b7f91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324501393 +0000 UTC m=+0.609726671,LastTimestamp:2026-03-12 14:47:28.439684199 +0000 UTC m=+0.724909497,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.150948 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b1a82\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b1a82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324475522 +0000 UTC m=+0.609700790,LastTimestamp:2026-03-12 14:47:28.439824603 +0000 UTC m=+0.725049881,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.154616 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1f5f810b6079\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1f5f810b6079 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.324493433 +0000 UTC m=+0.609718711,LastTimestamp:2026-03-12 14:47:28.439837634 +0000 UTC m=+0.725062902,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.158725 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1f5f9ea349e9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.820988393 +0000 UTC m=+1.106213671,LastTimestamp:2026-03-12 14:47:28.820988393 +0000 UTC m=+1.106213671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.163806 4869 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f5f9ed0e67f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.823977599 +0000 UTC m=+1.109202877,LastTimestamp:2026-03-12 14:47:28.823977599 +0000 UTC m=+1.109202877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.168923 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f5f9f409a8d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.831298189 +0000 UTC m=+1.116523467,LastTimestamp:2026-03-12 14:47:28.831298189 +0000 UTC m=+1.116523467,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.171990 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fa05cb0fa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.849916154 +0000 UTC m=+1.135141432,LastTimestamp:2026-03-12 14:47:28.849916154 +0000 UTC m=+1.135141432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.174996 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f5fa1331891 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:28.863967377 +0000 UTC m=+1.149192655,LastTimestamp:2026-03-12 14:47:28.863967377 +0000 UTC m=+1.149192655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.178816 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fc153f01a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.402990618 +0000 UTC m=+1.688215896,LastTimestamp:2026-03-12 14:47:29.402990618 +0000 UTC m=+1.688215896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.182587 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1f5fc1672898 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.404250264 +0000 UTC m=+1.689475542,LastTimestamp:2026-03-12 14:47:29.404250264 +0000 UTC m=+1.689475542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.186076 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f5fc16ed8bd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.404754109 +0000 UTC m=+1.689979387,LastTimestamp:2026-03-12 14:47:29.404754109 +0000 UTC m=+1.689979387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.190391 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f5fc1740577 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.405093239 +0000 UTC m=+1.690318517,LastTimestamp:2026-03-12 14:47:29.405093239 +0000 UTC m=+1.690318517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.194562 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f5fc17f02e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.405813479 +0000 UTC m=+1.691038757,LastTimestamp:2026-03-12 14:47:29.405813479 +0000 UTC m=+1.691038757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.198265 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fc1f7ba72 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.413724786 +0000 UTC m=+1.698950064,LastTimestamp:2026-03-12 14:47:29.413724786 +0000 UTC m=+1.698950064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.202741 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fc2156546 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.415669062 +0000 UTC m=+1.700894340,LastTimestamp:2026-03-12 14:47:29.415669062 +0000 UTC m=+1.700894340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.206280 4869 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f5fc2535c59 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.419730009 +0000 UTC m=+1.704955287,LastTimestamp:2026-03-12 14:47:29.419730009 +0000 UTC m=+1.704955287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.209937 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f5fc2ce3e84 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.4277833 +0000 UTC m=+1.713008578,LastTimestamp:2026-03-12 14:47:29.4277833 +0000 UTC m=+1.713008578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.213103 4869 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1f5fc2cefb13 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.427831571 +0000 UTC m=+1.713056849,LastTimestamp:2026-03-12 14:47:29.427831571 +0000 UTC m=+1.713056849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.216899 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f5fc2cefe6f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.427832431 +0000 UTC m=+1.713057709,LastTimestamp:2026-03-12 14:47:29.427832431 +0000 UTC m=+1.713057709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.220595 4869 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fd4bde46b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.728701547 +0000 UTC m=+2.013926835,LastTimestamp:2026-03-12 14:47:29.728701547 +0000 UTC m=+2.013926835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.225033 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fd56b0ab0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.740049072 +0000 UTC m=+2.025274350,LastTimestamp:2026-03-12 14:47:29.740049072 +0000 UTC m=+2.025274350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.229493 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fd57b46ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.741113003 +0000 UTC m=+2.026338281,LastTimestamp:2026-03-12 14:47:29.741113003 +0000 UTC m=+2.026338281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.233370 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fe0fe9c93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.934269587 +0000 UTC m=+2.219494865,LastTimestamp:2026-03-12 14:47:29.934269587 +0000 UTC m=+2.219494865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.236892 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fe19bc37f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.944568703 +0000 UTC m=+2.229794001,LastTimestamp:2026-03-12 14:47:29.944568703 +0000 UTC m=+2.229794001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.240430 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fe1b6d01b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.946341403 +0000 UTC m=+2.231566681,LastTimestamp:2026-03-12 14:47:29.946341403 +0000 UTC m=+2.231566681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.244075 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5feb7d519f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.110345631 +0000 UTC m=+2.395570909,LastTimestamp:2026-03-12 14:47:30.110345631 +0000 UTC 
m=+2.395570909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.247809 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fec207818 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.121037848 +0000 UTC m=+2.406263136,LastTimestamp:2026-03-12 14:47:30.121037848 +0000 UTC m=+2.406263136,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.251700 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f5ffa205515 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.355909909 +0000 UTC m=+2.641135187,LastTimestamp:2026-03-12 14:47:30.355909909 +0000 UTC m=+2.641135187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.255916 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f5ffa341dcc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.357206476 +0000 UTC m=+2.642431754,LastTimestamp:2026-03-12 14:47:30.357206476 +0000 UTC m=+2.642431754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.259687 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1f5ffa4ef354 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.358965076 +0000 UTC m=+2.644190354,LastTimestamp:2026-03-12 14:47:30.358965076 +0000 UTC m=+2.644190354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.263631 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f5ffa6bf62f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.360866351 +0000 UTC m=+2.646091629,LastTimestamp:2026-03-12 14:47:30.360866351 +0000 UTC m=+2.646091629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 
14:47:53.267856 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f6007062a08 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.57229876 +0000 UTC m=+2.857524038,LastTimestamp:2026-03-12 14:47:30.57229876 +0000 UTC m=+2.857524038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.272230 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f6007225901 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.574145793 +0000 UTC m=+2.859371071,LastTimestamp:2026-03-12 14:47:30.574145793 +0000 UTC m=+2.859371071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.275418 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f600723cec3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.574241475 +0000 UTC m=+2.859466753,LastTimestamp:2026-03-12 14:47:30.574241475 +0000 UTC m=+2.859466753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.279326 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1f600728984e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.574555214 +0000 UTC m=+2.859780492,LastTimestamp:2026-03-12 14:47:30.574555214 +0000 UTC m=+2.859780492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.282903 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f6007d06e44 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.5855545 +0000 UTC m=+2.870779778,LastTimestamp:2026-03-12 14:47:30.5855545 +0000 UTC m=+2.870779778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.286941 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f6007e11d90 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.586647952 +0000 UTC m=+2.871873230,LastTimestamp:2026-03-12 14:47:30.586647952 +0000 UTC m=+2.871873230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: I0312 14:47:53.290483 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.290683 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f6007f555eb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.587973099 +0000 UTC m=+2.873198387,LastTimestamp:2026-03-12 14:47:30.587973099 +0000 UTC m=+2.873198387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.292580 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f6008149dba openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.590023098 +0000 UTC m=+2.875248376,LastTimestamp:2026-03-12 14:47:30.590023098 +0000 UTC m=+2.875248376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.295160 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1f60081564ea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.59007409 +0000 UTC m=+2.875299368,LastTimestamp:2026-03-12 14:47:30.59007409 +0000 UTC m=+2.875299368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.300083 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f60087fb714 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.59704194 +0000 UTC m=+2.882267218,LastTimestamp:2026-03-12 14:47:30.59704194 +0000 UTC m=+2.882267218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.304256 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f6012f74355 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.772648789 +0000 UTC m=+3.057874067,LastTimestamp:2026-03-12 14:47:30.772648789 +0000 UTC m=+3.057874067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 
14:47:53.308526 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f601302280c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.7733627 +0000 UTC m=+3.058587978,LastTimestamp:2026-03-12 14:47:30.7733627 +0000 UTC m=+3.058587978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.312756 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f6013c95c5a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.786417754 +0000 UTC m=+3.071643032,LastTimestamp:2026-03-12 14:47:30.786417754 +0000 UTC m=+3.071643032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.317043 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f6013dae1cb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.787566027 +0000 UTC m=+3.072791295,LastTimestamp:2026-03-12 14:47:30.787566027 +0000 UTC m=+3.072791295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.320977 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f6013fd6f8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.789830542 +0000 UTC m=+3.075055820,LastTimestamp:2026-03-12 14:47:30.789830542 +0000 UTC m=+3.075055820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.324504 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f60141140ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.79112926 +0000 UTC m=+3.076354538,LastTimestamp:2026-03-12 14:47:30.79112926 +0000 UTC m=+3.076354538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.327620 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f6020167a50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.992798288 +0000 UTC m=+3.278023586,LastTimestamp:2026-03-12 14:47:30.992798288 +0000 UTC m=+3.278023586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.331147 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f602016ec11 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:30.992827409 +0000 UTC m=+3.278052717,LastTimestamp:2026-03-12 14:47:30.992827409 +0000 UTC m=+3.278052717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.334982 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1f60215ff08a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.014389898 +0000 UTC m=+3.299615176,LastTimestamp:2026-03-12 14:47:31.014389898 +0000 UTC m=+3.299615176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.338885 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f60217f2228 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.016434216 +0000 UTC m=+3.301659494,LastTimestamp:2026-03-12 14:47:31.016434216 +0000 UTC m=+3.301659494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.341881 4869 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f60218d4cd3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.017362643 +0000 UTC m=+3.302587921,LastTimestamp:2026-03-12 14:47:31.017362643 +0000 UTC m=+3.302587921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.344913 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f602b6db90f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.183065359 +0000 UTC m=+3.468290637,LastTimestamp:2026-03-12 14:47:31.183065359 +0000 UTC 
m=+3.468290637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.347919 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f602c3a3fc0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.196469184 +0000 UTC m=+3.481694462,LastTimestamp:2026-03-12 14:47:31.196469184 +0000 UTC m=+3.481694462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.351104 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f602c50c800 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.197945856 +0000 UTC m=+3.483171134,LastTimestamp:2026-03-12 14:47:31.197945856 +0000 UTC m=+3.483171134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.354951 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f603603558e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.360642446 +0000 UTC m=+3.645867734,LastTimestamp:2026-03-12 14:47:31.360642446 +0000 UTC m=+3.645867734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.358942 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f6036a67adb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.371334363 +0000 UTC m=+3.656559641,LastTimestamp:2026-03-12 14:47:31.371334363 +0000 UTC m=+3.656559641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.363492 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f6037777ff8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.385032696 +0000 UTC m=+3.670257974,LastTimestamp:2026-03-12 14:47:31.385032696 +0000 UTC m=+3.670257974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.367720 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f6041bfa97d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.557534077 +0000 UTC m=+3.842759355,LastTimestamp:2026-03-12 14:47:31.557534077 +0000 UTC m=+3.842759355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.371123 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f6042bfc7ab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.574319019 +0000 UTC m=+3.859544287,LastTimestamp:2026-03-12 14:47:31.574319019 +0000 UTC m=+3.859544287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.379619 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c1f60731cd0dc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.385722588 +0000 UTC m=+4.670947866,LastTimestamp:2026-03-12 14:47:32.385722588 +0000 UTC m=+4.670947866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.384454 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f607b917a26 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.52758583 +0000 UTC m=+4.812811108,LastTimestamp:2026-03-12 14:47:32.52758583 +0000 UTC m=+4.812811108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.388583 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f607be8abe2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.533300194 +0000 UTC m=+4.818525462,LastTimestamp:2026-03-12 14:47:32.533300194 +0000 UTC m=+4.818525462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.393370 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f607bf63717 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.534187799 +0000 UTC m=+4.819413077,LastTimestamp:2026-03-12 14:47:32.534187799 +0000 UTC m=+4.819413077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.399772 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f60852cef44 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.688768836 +0000 UTC m=+4.973994104,LastTimestamp:2026-03-12 14:47:32.688768836 +0000 UTC m=+4.973994104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.404965 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f6085c9ff7d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.699062141 +0000 UTC m=+4.984287419,LastTimestamp:2026-03-12 14:47:32.699062141 +0000 UTC m=+4.984287419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.409641 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189c1f6085e395a3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.700738979 +0000 UTC m=+4.985964257,LastTimestamp:2026-03-12 14:47:32.700738979 +0000 UTC m=+4.985964257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.414424 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f608f1c5494 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.85545282 +0000 UTC m=+5.140678098,LastTimestamp:2026-03-12 14:47:32.85545282 +0000 UTC m=+5.140678098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.418143 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f608ffa8424 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.870013988 +0000 UTC m=+5.155239266,LastTimestamp:2026-03-12 14:47:32.870013988 +0000 UTC m=+5.155239266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.423247 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f60900784fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:32.870866172 +0000 UTC m=+5.156091450,LastTimestamp:2026-03-12 14:47:32.870866172 +0000 UTC m=+5.156091450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.428184 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f609a4b0063 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:33.043060835 +0000 UTC m=+5.328286113,LastTimestamp:2026-03-12 14:47:33.043060835 +0000 UTC m=+5.328286113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.432165 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f609b143b1b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:33.056248603 +0000 UTC m=+5.341473881,LastTimestamp:2026-03-12 14:47:33.056248603 +0000 UTC m=+5.341473881,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.436452 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f609b252293 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:33.057356435 +0000 UTC m=+5.342581713,LastTimestamp:2026-03-12 14:47:33.057356435 +0000 UTC m=+5.342581713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.440197 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f60a5217458 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:33.224887384 +0000 UTC m=+5.510112662,LastTimestamp:2026-03-12 14:47:33.224887384 +0000 UTC m=+5.510112662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.443766 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1f60a59efd16 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:33.23311439 +0000 UTC m=+5.518339668,LastTimestamp:2026-03-12 14:47:33.23311439 +0000 UTC m=+5.518339668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.450253 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 14:47:53 crc kubenswrapper[4869]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1f617d81345c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 12 14:47:53 crc kubenswrapper[4869]: body: Mar 12 14:47:53 crc kubenswrapper[4869]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:36.855041116 +0000 UTC m=+9.140266414,LastTimestamp:2026-03-12 14:47:36.855041116 +0000 UTC m=+9.140266414,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 14:47:53 crc kubenswrapper[4869]: > Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.455383 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f617d838594 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:36.85519298 +0000 UTC m=+9.140418288,LastTimestamp:2026-03-12 14:47:36.85519298 +0000 UTC m=+9.140418288,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.461514 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 14:47:53 crc kubenswrapper[4869]: &Event{ObjectMeta:{kube-apiserver-crc.189c1f62edb13c67 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 12 14:47:53 crc kubenswrapper[4869]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 14:47:53 crc kubenswrapper[4869]: Mar 12 14:47:53 crc kubenswrapper[4869]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:43.032204391 +0000 UTC m=+15.317429890,LastTimestamp:2026-03-12 14:47:43.032204391 +0000 UTC m=+15.317429890,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 14:47:53 crc kubenswrapper[4869]: > Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.465773 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f62edb3aaba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:43.032363706 +0000 UTC m=+15.317588994,LastTimestamp:2026-03-12 14:47:43.032363706 +0000 UTC m=+15.317588994,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.470516 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c1f62edb13c67\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 14:47:53 crc kubenswrapper[4869]: &Event{ObjectMeta:{kube-apiserver-crc.189c1f62edb13c67 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 12 14:47:53 crc kubenswrapper[4869]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 14:47:53 crc kubenswrapper[4869]: Mar 12 14:47:53 crc kubenswrapper[4869]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:43.032204391 +0000 UTC m=+15.317429890,LastTimestamp:2026-03-12 14:47:43.037746791 +0000 UTC m=+15.322972069,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 14:47:53 crc kubenswrapper[4869]: > Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.475690 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c1f62edb3aaba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f62edb3aaba openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:43.032363706 +0000 UTC m=+15.317588994,LastTimestamp:2026-03-12 14:47:43.037789892 +0000 UTC m=+15.323015170,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.479402 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c1f602c50c800\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f602c50c800 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.197945856 +0000 UTC m=+3.483171134,LastTimestamp:2026-03-12 14:47:43.419987762 +0000 UTC m=+15.705213040,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.482661 4869 event.go:359] "Server rejected event (will 
not retry!)" err="events \"kube-apiserver-crc.189c1f603603558e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f603603558e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.360642446 +0000 UTC m=+3.645867734,LastTimestamp:2026-03-12 14:47:43.577490952 +0000 UTC m=+15.862716230,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.486346 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c1f6037777ff8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1f6037777ff8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:31.385032696 +0000 UTC m=+3.670257974,LastTimestamp:2026-03-12 14:47:43.584437462 +0000 UTC m=+15.869662750,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.491997 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 14:47:53 crc kubenswrapper[4869]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1f63d18d284a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 12 14:47:53 crc kubenswrapper[4869]: body: Mar 12 14:47:53 crc kubenswrapper[4869]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:46.855045194 +0000 UTC m=+19.140270492,LastTimestamp:2026-03-12 14:47:46.855045194 +0000 UTC m=+19.140270492,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 14:47:53 crc kubenswrapper[4869]: > Mar 12 14:47:53 crc kubenswrapper[4869]: E0312 14:47:53.495746 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f63d18e256c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:46.855109996 +0000 UTC m=+19.140335284,LastTimestamp:2026-03-12 14:47:46.855109996 +0000 UTC m=+19.140335284,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:54 crc kubenswrapper[4869]: I0312 14:47:54.296042 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:55 crc kubenswrapper[4869]: I0312 14:47:55.296528 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:55 crc kubenswrapper[4869]: W0312 14:47:55.590160 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 12 14:47:55 crc kubenswrapper[4869]: E0312 14:47:55.590210 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.293390 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.431155 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.432806 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.432859 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.432878 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.432912 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:47:56 crc kubenswrapper[4869]: E0312 14:47:56.436825 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 14:47:56 crc kubenswrapper[4869]: E0312 14:47:56.445966 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 14:47:56 crc kubenswrapper[4869]: W0312 14:47:56.558034 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource 
"csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:56 crc kubenswrapper[4869]: E0312 14:47:56.558098 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.854667 4869 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.854766 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.854872 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.855115 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.857036 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.857084 4869 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.857103 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.858684 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"894553568d4c9460c8eae3f6e38e7a070e5cd64faa344af8455c625368bf6ed7"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 14:47:56 crc kubenswrapper[4869]: I0312 14:47:56.859022 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://894553568d4c9460c8eae3f6e38e7a070e5cd64faa344af8455c625368bf6ed7" gracePeriod=30 Mar 12 14:47:56 crc kubenswrapper[4869]: E0312 14:47:56.865876 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1f63d18d284a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 14:47:56 crc kubenswrapper[4869]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1f63d18d284a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers) Mar 12 14:47:56 crc kubenswrapper[4869]: body: Mar 12 14:47:56 crc kubenswrapper[4869]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:46.855045194 +0000 UTC m=+19.140270492,LastTimestamp:2026-03-12 14:47:56.854741414 +0000 UTC m=+29.139966732,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 14:47:56 crc kubenswrapper[4869]: > Mar 12 14:47:56 crc kubenswrapper[4869]: E0312 14:47:56.870388 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1f63d18e256c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f63d18e256c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:46.855109996 +0000 UTC m=+19.140335284,LastTimestamp:2026-03-12 14:47:56.854821846 +0000 UTC m=+29.140047224,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:56 crc kubenswrapper[4869]: E0312 14:47:56.876108 4869 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f6625d5667e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:56.859000446 +0000 UTC m=+29.144225744,LastTimestamp:2026-03-12 14:47:56.859000446 +0000 UTC m=+29.144225744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:56 crc kubenswrapper[4869]: E0312 14:47:56.985478 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1f5fc2156546\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fc2156546 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.415669062 +0000 UTC m=+1.700894340,LastTimestamp:2026-03-12 14:47:56.977285941 +0000 UTC 
m=+29.262511259,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:57 crc kubenswrapper[4869]: E0312 14:47:57.206433 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1f5fd4bde46b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fd4bde46b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.728701547 +0000 UTC m=+2.013926835,LastTimestamp:2026-03-12 14:47:57.201838236 +0000 UTC m=+29.487063524,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:57 crc kubenswrapper[4869]: E0312 14:47:57.220732 4869 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1f5fd56b0ab0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1f5fd56b0ab0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:47:29.740049072 +0000 UTC m=+2.025274350,LastTimestamp:2026-03-12 14:47:57.214897921 +0000 UTC m=+29.500123209,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.292353 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.461696 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.462350 4869 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="894553568d4c9460c8eae3f6e38e7a070e5cd64faa344af8455c625368bf6ed7" exitCode=255 Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.462400 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"894553568d4c9460c8eae3f6e38e7a070e5cd64faa344af8455c625368bf6ed7"} Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.462440 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae74eb5bd2264b68ba4491c6694ec4082f8653359553bfcaab1ad779aceb4a7e"} Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.462614 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.463771 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.463811 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:47:57 crc kubenswrapper[4869]: I0312 14:47:57.463821 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:47:57 crc kubenswrapper[4869]: W0312 14:47:57.651067 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 12 14:47:57 crc kubenswrapper[4869]: E0312 14:47:57.651120 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 12 14:47:58 crc kubenswrapper[4869]: I0312 14:47:58.296961 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:47:58 crc kubenswrapper[4869]: E0312 14:47:58.404875 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed 
to get node info: node \"crc\" not found" Mar 12 14:47:58 crc kubenswrapper[4869]: W0312 14:47:58.875496 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 12 14:47:58 crc kubenswrapper[4869]: E0312 14:47:58.875605 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 12 14:47:59 crc kubenswrapper[4869]: I0312 14:47:59.294649 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:00 crc kubenswrapper[4869]: I0312 14:48:00.293948 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:01 crc kubenswrapper[4869]: I0312 14:48:01.292206 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:02 crc kubenswrapper[4869]: I0312 14:48:02.292861 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.295533 4869 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.336448 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.338355 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.338404 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.338419 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.339091 4869 scope.go:117] "RemoveContainer" containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.438060 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.439780 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.439831 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.439846 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.439879 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:48:03 crc kubenswrapper[4869]: E0312 14:48:03.447114 4869 kubelet_node_status.go:99] "Unable to register node with API 
server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 14:48:03 crc kubenswrapper[4869]: E0312 14:48:03.451963 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.853820 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.853989 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.855811 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.855878 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.855901 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:03 crc kubenswrapper[4869]: I0312 14:48:03.880287 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.293847 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.481986 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.482513 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.484408 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650" exitCode=255 Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.484486 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650"} Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.484530 4869 scope.go:117] "RemoveContainer" containerID="2ed06d17d3082f227eb4944b8bef9da4eed6472d87946adee0d53fcf9d715cbd" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.484585 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.484554 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.484633 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.485461 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.485471 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.485488 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.485496 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.485489 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.485557 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:04 crc kubenswrapper[4869]: I0312 14:48:04.486230 4869 scope.go:117] "RemoveContainer" containerID="8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650" Mar 12 14:48:04 crc kubenswrapper[4869]: E0312 14:48:04.486458 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.293484 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.487997 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.489962 4869 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.490758 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.490791 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.490801 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.511143 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.511288 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.512276 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.512315 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.512332 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:05 crc kubenswrapper[4869]: I0312 14:48:05.512864 4869 scope.go:117] "RemoveContainer" containerID="8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650" Mar 12 14:48:05 crc kubenswrapper[4869]: E0312 14:48:05.513045 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:48:06 crc kubenswrapper[4869]: I0312 14:48:06.292015 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:07 crc kubenswrapper[4869]: I0312 14:48:07.293336 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:08 crc kubenswrapper[4869]: I0312 14:48:08.293975 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:08 crc kubenswrapper[4869]: I0312 14:48:08.326970 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:48:08 crc kubenswrapper[4869]: I0312 14:48:08.327138 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:08 crc kubenswrapper[4869]: I0312 14:48:08.328158 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:08 crc kubenswrapper[4869]: I0312 14:48:08.328190 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:08 crc kubenswrapper[4869]: I0312 14:48:08.328203 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:08 crc kubenswrapper[4869]: I0312 14:48:08.328814 4869 scope.go:117] "RemoveContainer" 
containerID="8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650" Mar 12 14:48:08 crc kubenswrapper[4869]: E0312 14:48:08.329003 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:48:08 crc kubenswrapper[4869]: E0312 14:48:08.405060 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:48:09 crc kubenswrapper[4869]: I0312 14:48:09.293064 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:09 crc kubenswrapper[4869]: W0312 14:48:09.980128 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:09 crc kubenswrapper[4869]: E0312 14:48:09.980811 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 12 14:48:10 crc kubenswrapper[4869]: I0312 14:48:10.292166 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 12 14:48:10 crc kubenswrapper[4869]: I0312 14:48:10.447884 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:10 crc kubenswrapper[4869]: I0312 14:48:10.449279 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:10 crc kubenswrapper[4869]: I0312 14:48:10.449317 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:10 crc kubenswrapper[4869]: I0312 14:48:10.449331 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:10 crc kubenswrapper[4869]: I0312 14:48:10.449355 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:48:10 crc kubenswrapper[4869]: E0312 14:48:10.453467 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 14:48:10 crc kubenswrapper[4869]: E0312 14:48:10.453687 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 14:48:11 crc kubenswrapper[4869]: I0312 14:48:11.294809 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:12 crc kubenswrapper[4869]: I0312 14:48:12.294264 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:12 crc kubenswrapper[4869]: W0312 14:48:12.901596 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 12 14:48:12 crc kubenswrapper[4869]: E0312 14:48:12.901649 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 12 14:48:13 crc kubenswrapper[4869]: I0312 14:48:13.296328 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:14 crc kubenswrapper[4869]: I0312 14:48:14.293162 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:15 crc kubenswrapper[4869]: I0312 14:48:15.296084 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:15 crc kubenswrapper[4869]: W0312 14:48:15.725232 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 12 14:48:15 crc kubenswrapper[4869]: E0312 14:48:15.725280 4869 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 12 14:48:15 crc kubenswrapper[4869]: I0312 14:48:15.726658 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:48:15 crc kubenswrapper[4869]: I0312 14:48:15.726832 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:15 crc kubenswrapper[4869]: I0312 14:48:15.728160 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:15 crc kubenswrapper[4869]: I0312 14:48:15.728324 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:15 crc kubenswrapper[4869]: I0312 14:48:15.728395 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:16 crc kubenswrapper[4869]: I0312 14:48:16.295612 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:17 crc kubenswrapper[4869]: I0312 14:48:17.293254 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:17 crc kubenswrapper[4869]: I0312 14:48:17.453867 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 
14:48:17 crc kubenswrapper[4869]: I0312 14:48:17.455584 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:17 crc kubenswrapper[4869]: I0312 14:48:17.455626 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:17 crc kubenswrapper[4869]: I0312 14:48:17.455637 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:17 crc kubenswrapper[4869]: I0312 14:48:17.455667 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:48:17 crc kubenswrapper[4869]: E0312 14:48:17.458558 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 14:48:17 crc kubenswrapper[4869]: E0312 14:48:17.459073 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 14:48:18 crc kubenswrapper[4869]: I0312 14:48:18.295185 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:18 crc kubenswrapper[4869]: E0312 14:48:18.405646 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:48:19 crc kubenswrapper[4869]: I0312 14:48:19.292935 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:19 crc kubenswrapper[4869]: W0312 14:48:19.313509 4869 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 12 14:48:19 crc kubenswrapper[4869]: E0312 14:48:19.313621 4869 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 12 14:48:20 crc kubenswrapper[4869]: I0312 14:48:20.292645 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:20 crc kubenswrapper[4869]: I0312 14:48:20.943014 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 14:48:20 crc kubenswrapper[4869]: I0312 14:48:20.943147 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:20 crc kubenswrapper[4869]: I0312 14:48:20.944027 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:20 crc kubenswrapper[4869]: I0312 14:48:20.944068 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:20 crc kubenswrapper[4869]: I0312 14:48:20.944080 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:21 crc kubenswrapper[4869]: I0312 14:48:21.293228 4869 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:22 crc kubenswrapper[4869]: I0312 14:48:22.295684 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:23 crc kubenswrapper[4869]: I0312 14:48:23.294729 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:23 crc kubenswrapper[4869]: I0312 14:48:23.336686 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:23 crc kubenswrapper[4869]: I0312 14:48:23.343435 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:23 crc kubenswrapper[4869]: I0312 14:48:23.343495 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:23 crc kubenswrapper[4869]: I0312 14:48:23.343507 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:23 crc kubenswrapper[4869]: I0312 14:48:23.344238 4869 scope.go:117] "RemoveContainer" containerID="8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650" Mar 12 14:48:23 crc kubenswrapper[4869]: E0312 14:48:23.344419 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:48:24 crc kubenswrapper[4869]: I0312 14:48:24.292961 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:24 crc kubenswrapper[4869]: I0312 14:48:24.458652 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:24 crc kubenswrapper[4869]: I0312 14:48:24.460403 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:24 crc kubenswrapper[4869]: I0312 14:48:24.460472 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:24 crc kubenswrapper[4869]: I0312 14:48:24.460488 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:24 crc kubenswrapper[4869]: I0312 14:48:24.460520 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:48:24 crc kubenswrapper[4869]: E0312 14:48:24.465754 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 14:48:24 crc kubenswrapper[4869]: E0312 14:48:24.465823 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 14:48:25 crc kubenswrapper[4869]: I0312 14:48:25.293442 4869 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:26 crc kubenswrapper[4869]: I0312 14:48:26.293283 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:27 crc kubenswrapper[4869]: I0312 14:48:27.294357 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:28 crc kubenswrapper[4869]: I0312 14:48:28.293514 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:28 crc kubenswrapper[4869]: E0312 14:48:28.406757 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:48:29 crc kubenswrapper[4869]: I0312 14:48:29.292738 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:30 crc kubenswrapper[4869]: I0312 14:48:30.292846 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:31 crc kubenswrapper[4869]: I0312 14:48:31.295692 4869 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:31 crc kubenswrapper[4869]: I0312 14:48:31.466500 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:31 crc kubenswrapper[4869]: I0312 14:48:31.468188 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:31 crc kubenswrapper[4869]: I0312 14:48:31.468258 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:31 crc kubenswrapper[4869]: I0312 14:48:31.468269 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:31 crc kubenswrapper[4869]: I0312 14:48:31.468293 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:48:31 crc kubenswrapper[4869]: E0312 14:48:31.471261 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 14:48:31 crc kubenswrapper[4869]: E0312 14:48:31.471315 4869 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 14:48:32 crc kubenswrapper[4869]: I0312 14:48:32.295591 4869 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 14:48:32 crc kubenswrapper[4869]: 
I0312 14:48:32.755396 4869 csr.go:261] certificate signing request csr-2w66r is approved, waiting to be issued Mar 12 14:48:32 crc kubenswrapper[4869]: I0312 14:48:32.762660 4869 csr.go:257] certificate signing request csr-2w66r is issued Mar 12 14:48:32 crc kubenswrapper[4869]: I0312 14:48:32.769432 4869 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 12 14:48:33 crc kubenswrapper[4869]: I0312 14:48:33.164838 4869 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 12 14:48:33 crc kubenswrapper[4869]: I0312 14:48:33.335823 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:33 crc kubenswrapper[4869]: I0312 14:48:33.337430 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:33 crc kubenswrapper[4869]: I0312 14:48:33.337467 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:33 crc kubenswrapper[4869]: I0312 14:48:33.337478 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:33 crc kubenswrapper[4869]: I0312 14:48:33.764343 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-06 15:59:10.540788126 +0000 UTC Mar 12 14:48:33 crc kubenswrapper[4869]: I0312 14:48:33.764448 4869 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6457h10m36.776349039s for next certificate rotation Mar 12 14:48:36 crc kubenswrapper[4869]: I0312 14:48:36.336090 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:36 crc kubenswrapper[4869]: I0312 14:48:36.338144 4869 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:36 crc kubenswrapper[4869]: I0312 14:48:36.338194 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:36 crc kubenswrapper[4869]: I0312 14:48:36.338207 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:36 crc kubenswrapper[4869]: I0312 14:48:36.339047 4869 scope.go:117] "RemoveContainer" containerID="8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650" Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.568441 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.569629 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.571413 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" exitCode=255 Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.571557 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28"} Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.571691 4869 scope.go:117] "RemoveContainer" containerID="8d5e17c1774b5f938ce2df1791f5eed411e1507a0f54555cc3d00e4bd7496650" Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.571880 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:37 crc kubenswrapper[4869]: 
I0312 14:48:37.573011 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.573117 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.573217 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:37 crc kubenswrapper[4869]: I0312 14:48:37.573860 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:48:37 crc kubenswrapper[4869]: E0312 14:48:37.574136 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.326421 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.408671 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.472087 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.473320 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.473368 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.473379 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.473488 4869 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.482493 4869 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.482636 4869 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.482678 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.485421 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.485447 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.485457 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.485474 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.485487 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:38Z","lastTransitionTime":"2026-03-12T14:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.497502 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.503921 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.503959 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.503972 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.503990 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.504005 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:38Z","lastTransitionTime":"2026-03-12T14:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.513702 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.523079 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.523141 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.523166 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.523200 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.523216 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:38Z","lastTransitionTime":"2026-03-12T14:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.533651 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.540233 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.540401 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.540477 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.540569 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.540664 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:38Z","lastTransitionTime":"2026-03-12T14:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.548883 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.549162 4869 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.549243 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.576153 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.578585 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.579383 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.579469 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.579533 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:38 crc kubenswrapper[4869]: I0312 14:48:38.580076 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.580275 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.650271 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.750355 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.851490 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:38 crc kubenswrapper[4869]: E0312 14:48:38.951711 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.051845 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.152828 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.253939 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.354994 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.455452 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.556368 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.657297 4869 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.758239 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.858380 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:39 crc kubenswrapper[4869]: E0312 14:48:39.959498 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.059949 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.161044 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.262158 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.363069 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: I0312 14:48:40.367873 4869 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.464090 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.564689 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.666035 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.766467 4869 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.866983 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:40 crc kubenswrapper[4869]: E0312 14:48:40.967355 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.068592 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.169784 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.271041 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.372615 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.473488 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.574426 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.674596 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.775446 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc kubenswrapper[4869]: E0312 14:48:41.875999 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:41 crc 
kubenswrapper[4869]: E0312 14:48:41.977061 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.077157 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.178310 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.278844 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.379109 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.479723 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.579841 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.680185 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.781057 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.882408 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:42 crc kubenswrapper[4869]: E0312 14:48:42.983625 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.084741 4869 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.185201 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.286286 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.387020 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.487163 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.588308 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.689300 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.789951 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.890491 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:43 crc kubenswrapper[4869]: E0312 14:48:43.990978 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.091460 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.191707 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.292012 4869 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.392492 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.492798 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.592873 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.693862 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.794768 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.895576 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:44 crc kubenswrapper[4869]: E0312 14:48:44.995852 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.096861 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.198031 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.298177 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.398618 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 
14:48:45.498854 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: I0312 14:48:45.511106 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:48:45 crc kubenswrapper[4869]: I0312 14:48:45.511281 4869 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 14:48:45 crc kubenswrapper[4869]: I0312 14:48:45.512903 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:45 crc kubenswrapper[4869]: I0312 14:48:45.512945 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:45 crc kubenswrapper[4869]: I0312 14:48:45.512961 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:45 crc kubenswrapper[4869]: I0312 14:48:45.513473 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.513686 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.599895 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.701097 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc 
kubenswrapper[4869]: E0312 14:48:45.802228 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:45 crc kubenswrapper[4869]: E0312 14:48:45.902712 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.003574 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.104646 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.205722 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.306447 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.407146 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.507532 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.608462 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.709631 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.811371 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:46 crc kubenswrapper[4869]: E0312 14:48:46.911631 4869 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.012748 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.113804 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.214150 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.314399 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.414608 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.515427 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.615897 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.716798 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.817754 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:47 crc kubenswrapper[4869]: E0312 14:48:47.918659 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.019451 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.120653 4869 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.221671 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.322181 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.408981 4869 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.422390 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.523455 4869 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.574763 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.579593 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.579620 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.579629 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.579643 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.579652 4869 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.593879 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.599380 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.599452 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.599479 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.599510 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.599531 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.616827 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.619630 4869 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.621787 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.621816 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.621844 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.621859 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.621868 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.635704 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.639530 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.639573 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.639583 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.639596 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.639606 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.649919 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:48 crc kubenswrapper[4869]: E0312 14:48:48.650153 4869 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.652113 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.652171 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.652190 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.652215 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.652233 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.755374 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.756029 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.756124 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.756223 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.756315 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.859055 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.859271 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.859294 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.859308 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.859318 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.962910 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.963303 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.963381 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.963453 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:48 crc kubenswrapper[4869]: I0312 14:48:48.963522 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:48Z","lastTransitionTime":"2026-03-12T14:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.067158 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.067513 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.067733 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.067903 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.068041 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.170374 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.170453 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.170478 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.170508 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.170525 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.273299 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.273798 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.273948 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.274133 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.274294 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.321739 4869 apiserver.go:52] "Watching apiserver" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.333727 4869 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.334143 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hllm5","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-image-registry/node-ca-wkrgx","openshift-multus/multus-l8qfx","openshift-multus/multus-additional-cni-plugins-thzhj","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h","openshift-dns/node-resolver-9pvbj","openshift-machine-config-operator/machine-config-daemon-2lgzz","openshift-ovn-kubernetes/ovnkube-node-42vwv","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h"] Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.334469 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.334888 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335013 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335163 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335167 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335203 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335410 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.334889 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.334645 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335594 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335644 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.335647 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.335692 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.336243 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.336578 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.336669 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.336740 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.336893 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.340074 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.341159 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.343170 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.343309 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.343527 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.344101 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.344252 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.344367 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.344586 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.344266 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.344839 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.344938 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345000 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345224 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345354 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345472 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345723 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345852 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345502 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345534 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345582 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 12 
14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.345693 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.346293 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.346443 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.346333 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.346821 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.347102 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.347264 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.347385 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.347501 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.347725 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.347934 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 12 
14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.347989 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.348154 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.349593 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.349883 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.349886 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.371852 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.376998 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.377052 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.377067 4869 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.377090 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.377106 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.385479 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.392989 4869 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399076 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399153 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399189 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399220 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399246 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399272 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399299 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399333 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399361 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399386 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399413 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399439 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399462 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399400 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399490 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399517 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399563 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399589 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399622 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399647 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399674 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399700 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.399728 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399752 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399781 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399845 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399880 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.399951 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400015 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400043 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400066 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400116 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400143 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400364 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400400 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400432 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400459 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400589 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400614 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400705 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400741 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400805 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400859 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400884 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400910 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400941 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.400979 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401010 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401034 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.401061 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401089 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401114 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401138 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401162 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401186 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401207 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401236 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401268 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401299 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401326 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.401351 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401412 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401439 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401689 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401661 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401772 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401870 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.401882 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402041 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402134 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402178 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402209 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402223 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402264 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402307 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402348 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402388 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402396 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402439 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402475 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402495 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402576 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402616 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402636 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402656 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402706 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402758 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402796 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402834 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402873 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402910 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402948 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402986 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.402922 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403029 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403135 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403158 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403221 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403279 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.403330 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403379 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403426 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403470 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403603 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403631 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403692 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403745 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403797 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403843 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403888 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403923 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403935 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.403986 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404031 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404075 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404118 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404130 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404162 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404212 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404258 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404301 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404343 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404385 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404434 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404482 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404435 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404574 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404629 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404684 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404735 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404788 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" 
(UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404841 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404892 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404939 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.404993 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405041 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.405095 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405148 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405201 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405254 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405304 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405353 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405418 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405476 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405531 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405622 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405630 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). 
InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405676 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405742 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405809 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405868 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.405915 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.405966 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406008 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406051 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406143 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406200 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406252 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406307 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406351 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406390 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406434 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406399 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406474 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406417 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406516 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406589 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406631 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406644 4869 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406769 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406818 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406676 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.406985 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407040 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407095 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407148 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407199 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407251 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407309 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407355 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407400 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407463 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 
14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407514 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407577 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407616 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407655 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407699 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407737 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407776 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407826 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407898 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407972 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.408810 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.408881 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.408924 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.408995 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409048 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409089 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409130 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409174 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409254 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409339 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409406 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.410697 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411373 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411450 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411504 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411579 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411627 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411679 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411733 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411793 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411854 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412123 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412335 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412408 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412483 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412596 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412789 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412880 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-netd\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412937 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-k8s-cni-cncf-io\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413006 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413064 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-824vx\" (UniqueName: \"kubernetes.io/projected/8415254a-55e8-451e-8be1-364b98f44196-kube-api-access-824vx\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413124 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413185 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-hostroot\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.413245 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413302 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413352 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fe3a984d-84f6-4421-85dc-ecb4bdb29707-hosts-file\") pod \"node-resolver-9pvbj\" (UID: \"fe3a984d-84f6-4421-85dc-ecb4bdb29707\") " pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413406 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss8tm\" (UniqueName: \"kubernetes.io/projected/fe3a984d-84f6-4421-85dc-ecb4bdb29707-kube-api-access-ss8tm\") pod \"node-resolver-9pvbj\" (UID: \"fe3a984d-84f6-4421-85dc-ecb4bdb29707\") " pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413478 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-log-socket\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 
14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413598 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7rnt\" (UniqueName: \"kubernetes.io/projected/7edaf111-2689-4453-ba78-00677e1b6316-kube-api-access-t7rnt\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413643 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-conf-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413684 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413732 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413773 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-systemd\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413814 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c553534-16cf-4a8f-8d01-518e9526a117-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78csh\" (UniqueName: \"kubernetes.io/projected/2c553534-16cf-4a8f-8d01-518e9526a117-kube-api-access-78csh\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413901 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413942 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1621c994-94d2-4105-a988-f4739518ba91-proxy-tls\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413981 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/1621c994-94d2-4105-a988-f4739518ba91-mcd-auth-proxy-config\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414020 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7edaf111-2689-4453-ba78-00677e1b6316-ovn-node-metrics-cert\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414094 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-netns\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414140 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-cni-multus\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414180 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-multus-certs\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414229 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414272 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-ovn-kubernetes\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414333 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414393 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9eee1745-fda9-4657-8cb0-d491ae450f82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414461 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.414521 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-script-lib\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414613 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-os-release\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414655 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fd2bb3f-6860-4631-a95c-c910d33724b6-cni-binary-copy\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414693 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-cni-bin\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414743 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.414786 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-ovn\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414837 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414876 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-netns\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414920 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-env-overrides\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414975 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-system-cni-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: 
I0312 14:48:49.415030 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c553534-16cf-4a8f-8d01-518e9526a117-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415091 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415150 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-cnibin\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415197 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-os-release\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415314 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-etc-kubernetes\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " 
pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415387 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9eee1745-fda9-4657-8cb0-d491ae450f82-cni-binary-copy\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415462 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-kubelet\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407097 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407131 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407182 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.407557 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.408260 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409431 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409583 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.409572 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.410058 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.410420 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.418970 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.410471 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.410708 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411418 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411698 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.411746 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412174 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412738 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.412804 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413112 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413127 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.413687 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414179 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.414375 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415065 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415321 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.415565 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:48:49.915527939 +0000 UTC m=+82.200753217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415946 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.415966 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.416069 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.416218 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.416590 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.416683 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.416775 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.416807 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.417093 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.417740 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.417993 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.418283 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.418599 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.418721 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419488 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419517 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419142 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419652 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419855 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419853 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419874 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-systemd-units\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.418846 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419008 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419051 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419219 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.419344 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.419941 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.418680 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.420140 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.420343 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.420567 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.420786 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.421080 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.422029 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.421817 4869 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.422442 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.422451 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.422526 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.422922 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423066 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423095 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.422783 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423366 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423483 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423436 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423458 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423472 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423606 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.422923 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.423751 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.424083 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.424173 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.424225 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.424228 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.424794 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.424963 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.425087 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.424719 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.425643 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.425915 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.426190 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.426347 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.426354 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.426374 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.426969 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.427014 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.427361 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.427660 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.427757 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.427771 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.427825 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.427920 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.428250 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.428278 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.428300 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.428671 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.428784 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.428821 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.429054 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.429112 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.429308 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.429716 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.429816 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430158 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430169 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430178 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430268 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-cni-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430304 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-socket-dir-parent\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.420029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430406 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhml\" (UniqueName: \"kubernetes.io/projected/2fd2bb3f-6860-4631-a95c-c910d33724b6-kube-api-access-2dhml\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430273 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.430469 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430510 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.430528 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:49.930511686 +0000 UTC m=+82.215736964 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430627 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-slash\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.430697 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:49.930689271 +0000 UTC m=+82.215914539 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430729 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430777 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430816 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-kubelet\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430871 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-daemon-config\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430924 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-system-cni-dir\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.430965 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxrpz\" (UniqueName: \"kubernetes.io/projected/f5344bfd-e537-4710-abf4-24ece04a3ff0-kube-api-access-jxrpz\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431000 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c553534-16cf-4a8f-8d01-518e9526a117-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431108 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6qq8\" (UniqueName: \"kubernetes.io/projected/9eee1745-fda9-4657-8cb0-d491ae450f82-kube-api-access-s6qq8\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431146 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-etc-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431178 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-node-log\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431212 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431238 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5344bfd-e537-4710-abf4-24ece04a3ff0-serviceca\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431322 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-cnibin\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431497 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431582 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431613 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-var-lib-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431637 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5344bfd-e537-4710-abf4-24ece04a3ff0-host\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432138 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1621c994-94d2-4105-a988-f4739518ba91-rootfs\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.431743 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: 
"1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432143 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432183 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432215 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr2bz\" (UniqueName: \"kubernetes.io/projected/1621c994-94d2-4105-a988-f4739518ba91-kube-api-access-gr2bz\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432270 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432447 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-bin\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432490 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-config\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432507 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432676 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432706 4869 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432720 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432737 4869 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432753 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432771 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432786 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.432799 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432812 4869 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432821 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432831 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432841 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432759 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432851 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432896 4869 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432914 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432932 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432947 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432962 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.432977 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.432990 4869 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433004 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433017 4869 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433031 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433046 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433059 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433076 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433090 4869 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433104 4869 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433120 4869 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433133 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433146 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433159 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433171 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433183 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433195 4869 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433209 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433222 4869 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433235 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433247 4869 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433262 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433279 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433293 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433305 4869 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433319 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433414 4869 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433427 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433440 4869 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433454 4869 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on 
node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433467 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433480 4869 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433494 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433508 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433520 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433533 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433564 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.433577 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433588 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433601 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433614 4869 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433626 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433638 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433650 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433663 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433677 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433689 4869 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433701 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433713 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433726 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433738 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433752 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.433764 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433776 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433789 4869 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433803 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433815 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433828 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433841 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433855 4869 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433873 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433903 4869 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433922 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433938 4869 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433955 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433971 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.433987 4869 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434002 4869 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434016 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434034 4869 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434050 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434066 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434114 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434129 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 
14:48:49.434144 4869 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434160 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434179 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434196 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434210 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434224 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434236 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434248 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434264 4869 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434277 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434289 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434302 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434316 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434329 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434343 4869 reconciler_common.go:293] "Volume detached for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434356 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434368 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434382 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434400 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434418 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434434 4869 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434448 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434462 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434476 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434488 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434503 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434516 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434529 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434583 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434597 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434609 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434621 4869 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434634 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434649 4869 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434661 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434675 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434688 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434701 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434714 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434725 4869 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434738 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434752 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434764 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.434776 4869 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.435394 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.435485 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.436344 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.436919 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.438382 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.438927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.439054 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.439727 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.440386 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.441077 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.441752 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.445368 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.445697 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.445889 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.446257 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.446830 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.446949 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.449237 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.449262 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.449304 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.449717 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449763 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449786 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449801 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449838 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449858 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449868 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:49.949849972 +0000 UTC m=+82.235075250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.449855 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449873 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.449996 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:49.949972205 +0000 UTC m=+82.235197493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.450188 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.450374 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.450880 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.451320 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.453911 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.453976 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.454108 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.455089 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.455249 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.455630 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.455854 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.455970 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.456308 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod 
"5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.456852 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.456943 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.457155 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.457410 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.457205 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.457685 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.457716 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.457802 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.458990 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.459082 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.461075 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.461073 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.462511 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.462615 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.462717 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.462969 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.463633 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.463772 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.468435 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.468474 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.468586 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.468860 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.469144 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.469734 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.469847 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.472824 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.480079 4869 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.480146 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.480161 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.480181 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.480195 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.480832 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.485199 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.486078 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.490375 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.492570 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.509143 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\
\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.520085 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.529102 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536199 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-conf-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536247 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod 
\"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536271 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536294 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-systemd\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536313 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-conf-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536319 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c553534-16cf-4a8f-8d01-518e9526a117-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536383 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78csh\" (UniqueName: 
\"kubernetes.io/projected/2c553534-16cf-4a8f-8d01-518e9526a117-kube-api-access-78csh\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536452 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-systemd\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536557 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.536578 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536768 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1621c994-94d2-4105-a988-f4739518ba91-proxy-tls\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.536879 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs podName:8415254a-55e8-451e-8be1-364b98f44196 nodeName:}" failed. 
No retries permitted until 2026-03-12 14:48:50.036858467 +0000 UTC m=+82.322083795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs") pod "network-metrics-daemon-hllm5" (UID: "8415254a-55e8-451e-8be1-364b98f44196") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536916 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1621c994-94d2-4105-a988-f4739518ba91-mcd-auth-proxy-config\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536935 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7edaf111-2689-4453-ba78-00677e1b6316-ovn-node-metrics-cert\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536955 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-netns\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.536973 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-cni-multus\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.536988 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-multus-certs\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537013 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-ovn-kubernetes\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9eee1745-fda9-4657-8cb0-d491ae450f82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537068 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-script-lib\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537103 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-os-release\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537120 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fd2bb3f-6860-4631-a95c-c910d33724b6-cni-binary-copy\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537139 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-cni-bin\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537164 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-ovn\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537190 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-netns\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537205 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-env-overrides\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537220 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-system-cni-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537234 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c553534-16cf-4a8f-8d01-518e9526a117-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537256 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-cnibin\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537275 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-os-release\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537290 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-etc-kubernetes\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537296 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-netns\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9eee1745-fda9-4657-8cb0-d491ae450f82-cni-binary-copy\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537394 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-cni-multus\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537428 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-kubelet\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537466 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-kubelet\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537479 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-multus-certs\") pod 
\"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537488 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-systemd-units\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537517 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-cni-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537523 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-ovn\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537562 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-socket-dir-parent\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537571 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-netns\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.537588 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhml\" (UniqueName: \"kubernetes.io/projected/2fd2bb3f-6860-4631-a95c-c910d33724b6-kube-api-access-2dhml\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537613 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-slash\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537640 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537665 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-kubelet\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537687 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-daemon-config\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537708 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-system-cni-dir\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537732 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrpz\" (UniqueName: \"kubernetes.io/projected/f5344bfd-e537-4710-abf4-24ece04a3ff0-kube-api-access-jxrpz\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537756 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c553534-16cf-4a8f-8d01-518e9526a117-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537779 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6qq8\" (UniqueName: \"kubernetes.io/projected/9eee1745-fda9-4657-8cb0-d491ae450f82-kube-api-access-s6qq8\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537811 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-etc-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.537832 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-node-log\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537859 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5344bfd-e537-4710-abf4-24ece04a3ff0-serviceca\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537884 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-cnibin\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537897 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537938 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-var-lib-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537953 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1621c994-94d2-4105-a988-f4739518ba91-mcd-auth-proxy-config\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537999 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-kubelet\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538003 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5344bfd-e537-4710-abf4-24ece04a3ff0-host\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537962 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5344bfd-e537-4710-abf4-24ece04a3ff0-host\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538006 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-cni-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537956 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-systemd-units\") pod 
\"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538040 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1621c994-94d2-4105-a988-f4739518ba91-rootfs\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538058 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9eee1745-fda9-4657-8cb0-d491ae450f82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538064 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1621c994-94d2-4105-a988-f4739518ba91-rootfs\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538089 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr2bz\" (UniqueName: \"kubernetes.io/projected/1621c994-94d2-4105-a988-f4739518ba91-kube-api-access-gr2bz\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538169 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-bin\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538173 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-env-overrides\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538193 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-config\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538246 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-system-cni-dir\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538288 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-netd\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538321 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-k8s-cni-cncf-io\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538363 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-slash\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538358 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-824vx\" (UniqueName: \"kubernetes.io/projected/8415254a-55e8-451e-8be1-364b98f44196-kube-api-access-824vx\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538423 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-log-socket\") pod \"ovnkube-node-42vwv\" (UID: 
\"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538448 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7rnt\" (UniqueName: \"kubernetes.io/projected/7edaf111-2689-4453-ba78-00677e1b6316-kube-api-access-t7rnt\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538473 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-hostroot\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538496 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538518 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fe3a984d-84f6-4421-85dc-ecb4bdb29707-hosts-file\") pod \"node-resolver-9pvbj\" (UID: \"fe3a984d-84f6-4421-85dc-ecb4bdb29707\") " pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538039 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-socket-dir-parent\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.538562 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss8tm\" (UniqueName: \"kubernetes.io/projected/fe3a984d-84f6-4421-85dc-ecb4bdb29707-kube-api-access-ss8tm\") pod \"node-resolver-9pvbj\" (UID: \"fe3a984d-84f6-4421-85dc-ecb4bdb29707\") " pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538635 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538699 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c553534-16cf-4a8f-8d01-518e9526a117-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.537517 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-ovn-kubernetes\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.538969 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-script-lib\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.539994 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2fd2bb3f-6860-4631-a95c-c910d33724b6-multus-daemon-config\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.540011 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c553534-16cf-4a8f-8d01-518e9526a117-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.540297 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-cnibin\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.540551 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-etc-kubernetes\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541069 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1621c994-94d2-4105-a988-f4739518ba91-proxy-tls\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541082 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9eee1745-fda9-4657-8cb0-d491ae450f82-cni-binary-copy\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541117 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 
crc kubenswrapper[4869]: I0312 14:48:49.541145 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-etc-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541148 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-bin\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541231 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-hostroot\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541247 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-system-cni-dir\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541337 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fe3a984d-84f6-4421-85dc-ecb4bdb29707-hosts-file\") pod \"node-resolver-9pvbj\" (UID: \"fe3a984d-84f6-4421-85dc-ecb4bdb29707\") " pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541337 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-log-socket\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541493 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-cnibin\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541522 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-node-log\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541871 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-var-lib-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.541911 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-var-lib-cni-bin\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.542019 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c553534-16cf-4a8f-8d01-518e9526a117-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: 
\"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.542034 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-config\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.542074 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-host-run-k8s-cni-cncf-io\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.542417 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-netd\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.542400 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2fd2bb3f-6860-4631-a95c-c910d33724b6-os-release\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.542457 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-openvswitch\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc 
kubenswrapper[4869]: I0312 14:48:49.542588 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7edaf111-2689-4453-ba78-00677e1b6316-ovn-node-metrics-cert\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.542896 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2fd2bb3f-6860-4631-a95c-c910d33724b6-cni-binary-copy\") pod \"multus-l8qfx\" (UID: \"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.543742 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544155 4869 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544174 4869 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544187 4869 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544204 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544216 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544228 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544240 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544608 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544623 4869 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544637 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544650 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544663 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544709 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544723 4869 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544736 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544732 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544748 4869 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544788 4869 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544801 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544831 4869 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544857 4869 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544875 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544893 4869 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544911 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544930 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544948 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544970 4869 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.544988 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545006 4869 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545025 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545043 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545061 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545080 
4869 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545097 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545114 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545131 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545151 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545169 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545187 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545205 4869 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545222 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545239 4869 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545255 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545266 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5344bfd-e537-4710-abf4-24ece04a3ff0-serviceca\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545273 4869 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545872 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545891 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545906 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545918 4869 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545930 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545944 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545958 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545971 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545983 4869 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.545995 4869 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.546009 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.546022 4869 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.546034 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.546045 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.548140 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.549280 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9eee1745-fda9-4657-8cb0-d491ae450f82-os-release\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.554912 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhml\" (UniqueName: \"kubernetes.io/projected/2fd2bb3f-6860-4631-a95c-c910d33724b6-kube-api-access-2dhml\") pod \"multus-l8qfx\" (UID: 
\"2fd2bb3f-6860-4631-a95c-c910d33724b6\") " pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.555211 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78csh\" (UniqueName: \"kubernetes.io/projected/2c553534-16cf-4a8f-8d01-518e9526a117-kube-api-access-78csh\") pod \"ovnkube-control-plane-749d76644c-zgf6h\" (UID: \"2c553534-16cf-4a8f-8d01-518e9526a117\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.557387 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6qq8\" (UniqueName: \"kubernetes.io/projected/9eee1745-fda9-4657-8cb0-d491ae450f82-kube-api-access-s6qq8\") pod \"multus-additional-cni-plugins-thzhj\" (UID: \"9eee1745-fda9-4657-8cb0-d491ae450f82\") " pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.557986 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-824vx\" (UniqueName: \"kubernetes.io/projected/8415254a-55e8-451e-8be1-364b98f44196-kube-api-access-824vx\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.558358 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrpz\" (UniqueName: \"kubernetes.io/projected/f5344bfd-e537-4710-abf4-24ece04a3ff0-kube-api-access-jxrpz\") pod \"node-ca-wkrgx\" (UID: \"f5344bfd-e537-4710-abf4-24ece04a3ff0\") " pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.560199 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.562470 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss8tm\" (UniqueName: \"kubernetes.io/projected/fe3a984d-84f6-4421-85dc-ecb4bdb29707-kube-api-access-ss8tm\") pod \"node-resolver-9pvbj\" (UID: \"fe3a984d-84f6-4421-85dc-ecb4bdb29707\") " pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.562877 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7rnt\" (UniqueName: \"kubernetes.io/projected/7edaf111-2689-4453-ba78-00677e1b6316-kube-api-access-t7rnt\") pod \"ovnkube-node-42vwv\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.565728 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr2bz\" (UniqueName: 
\"kubernetes.io/projected/1621c994-94d2-4105-a988-f4739518ba91-kube-api-access-gr2bz\") pod \"machine-config-daemon-2lgzz\" (UID: \"1621c994-94d2-4105-a988-f4739518ba91\") " pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.567773 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.575141 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.581898 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.581932 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.581943 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 
14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.581957 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.581967 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.592989 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.600245 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.614565 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.653177 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.662079 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wkrgx" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.667390 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 12 14:48:49 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: source /etc/kubernetes/apiserver-url.env Mar 12 14:48:49 crc kubenswrapper[4869]: else Mar 12 14:48:49 crc kubenswrapper[4869]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 12 14:48:49 crc kubenswrapper[4869]: exit 1 Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 12 14:48:49 crc 
kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Valu
e:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metad
ata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.668749 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.671192 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5344bfd_e537_4710_abf4_24ece04a3ff0.slice/crio-b4517203c5417971561d1cac24e53ab4f9a9ccba2797a34f4a5f4cd4225dfc5c WatchSource:0}: Error finding container b4517203c5417971561d1cac24e53ab4f9a9ccba2797a34f4a5f4cd4225dfc5c: Status 404 returned error can't find the container with id b4517203c5417971561d1cac24e53ab4f9a9ccba2797a34f4a5f4cd4225dfc5c Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.672991 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.674802 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 12 14:48:49 crc kubenswrapper[4869]: while [ true ]; Mar 12 14:48:49 crc kubenswrapper[4869]: do Mar 12 14:48:49 crc kubenswrapper[4869]: for f in $(ls /tmp/serviceca); do Mar 12 14:48:49 crc kubenswrapper[4869]: echo $f Mar 12 14:48:49 crc kubenswrapper[4869]: ca_file_path="/tmp/serviceca/${f}" Mar 12 14:48:49 crc kubenswrapper[4869]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 12 14:48:49 crc kubenswrapper[4869]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 12 14:48:49 crc kubenswrapper[4869]: if [ -e "${reg_dir_path}" ]; then Mar 12 14:48:49 crc kubenswrapper[4869]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 12 14:48:49 crc kubenswrapper[4869]: else Mar 12 14:48:49 crc kubenswrapper[4869]: mkdir $reg_dir_path Mar 12 14:48:49 crc kubenswrapper[4869]: cp $ca_file_path $reg_dir_path/ca.crt Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: for d in $(ls /etc/docker/certs.d); do Mar 
12 14:48:49 crc kubenswrapper[4869]: echo $d Mar 12 14:48:49 crc kubenswrapper[4869]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 12 14:48:49 crc kubenswrapper[4869]: reg_conf_path="/tmp/serviceca/${dp}" Mar 12 14:48:49 crc kubenswrapper[4869]: if [ ! -e "${reg_conf_path}" ]; then Mar 12 14:48:49 crc kubenswrapper[4869]: rm -rf /etc/docker/certs.d/$d Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: sleep 60 & wait ${!} Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxrpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
node-ca-wkrgx_openshift-image-registry(f5344bfd-e537-4710-abf4-24ece04a3ff0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.675974 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-wkrgx" podUID="f5344bfd-e537-4710-abf4-24ece04a3ff0" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.683380 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.683528 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.683565 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.683574 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.683586 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.683594 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.683887 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-e176f194db94f38f8699b67ee930578b3fe4e650b9d66be55f7c9c2d6aa625ce WatchSource:0}: Error finding container e176f194db94f38f8699b67ee930578b3fe4e650b9d66be55f7c9c2d6aa625ce: Status 404 returned error can't find the container with id e176f194db94f38f8699b67ee930578b3fe4e650b9d66be55f7c9c2d6aa625ce Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.686871 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.688086 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.691959 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1621c994_94d2_4105_a988_f4739518ba91.slice/crio-d56c38143aed882c4ecd67b37d48878589ec84858bdb72f71b57711fe9316c77 WatchSource:0}: Error finding container d56c38143aed882c4ecd67b37d48878589ec84858bdb72f71b57711fe9316c77: Status 404 returned error can't find the container with id d56c38143aed882c4ecd67b37d48878589ec84858bdb72f71b57711fe9316c77 Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.693860 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gr2bz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.695071 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-thzhj" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.696661 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gr2bz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.698083 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.709159 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s6qq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-thzhj_openshift-multus(9eee1745-fda9-4657-8cb0-d491ae450f82): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.710287 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-thzhj" podUID="9eee1745-fda9-4657-8cb0-d491ae450f82" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.713845 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.725101 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7edaf111_2689_4453_ba78_00677e1b6316.slice/crio-7f1e42c08801651f0218ac42cd5a6a27bea2aa0c9fff4ba369ad616449216821 WatchSource:0}: Error finding container 7f1e42c08801651f0218ac42cd5a6a27bea2aa0c9fff4ba369ad616449216821: Status 404 returned error can't find the container with id 7f1e42c08801651f0218ac42cd5a6a27bea2aa0c9fff4ba369ad616449216821 Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.728635 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 12 14:48:49 crc kubenswrapper[4869]: apiVersion: v1 Mar 12 14:48:49 crc kubenswrapper[4869]: clusters: Mar 12 14:48:49 crc kubenswrapper[4869]: - cluster: Mar 12 14:48:49 crc kubenswrapper[4869]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 12 14:48:49 crc kubenswrapper[4869]: server: https://api-int.crc.testing:6443 Mar 12 14:48:49 crc kubenswrapper[4869]: name: default-cluster Mar 12 14:48:49 crc kubenswrapper[4869]: contexts: Mar 12 14:48:49 crc kubenswrapper[4869]: - context: Mar 12 14:48:49 crc kubenswrapper[4869]: cluster: default-cluster Mar 12 14:48:49 crc 
kubenswrapper[4869]: namespace: default Mar 12 14:48:49 crc kubenswrapper[4869]: user: default-auth Mar 12 14:48:49 crc kubenswrapper[4869]: name: default-context Mar 12 14:48:49 crc kubenswrapper[4869]: current-context: default-context Mar 12 14:48:49 crc kubenswrapper[4869]: kind: Config Mar 12 14:48:49 crc kubenswrapper[4869]: preferences: {} Mar 12 14:48:49 crc kubenswrapper[4869]: users: Mar 12 14:48:49 crc kubenswrapper[4869]: - name: default-auth Mar 12 14:48:49 crc kubenswrapper[4869]: user: Mar 12 14:48:49 crc kubenswrapper[4869]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 14:48:49 crc kubenswrapper[4869]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 14:48:49 crc kubenswrapper[4869]: EOF Mar 12 14:48:49 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7rnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-42vwv_openshift-ovn-kubernetes(7edaf111-2689-4453-ba78-00677e1b6316): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 
14:48:49.729870 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.750732 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.756768 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.760279 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e898c62f5c7075089a7c34e9b3608a62f82343958020d3331f218dd94150a1a9 WatchSource:0}: Error finding container e898c62f5c7075089a7c34e9b3608a62f82343958020d3331f218dd94150a1a9: Status 404 returned error can't find the container with id e898c62f5c7075089a7c34e9b3608a62f82343958020d3331f218dd94150a1a9 Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.762299 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ -f "/env/_master" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:49 crc kubenswrapper[4869]: source "/env/_master" Mar 12 14:48:49 crc kubenswrapper[4869]: set +o allexport Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: # OVN-K will try to 
remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 12 14:48:49 crc kubenswrapper[4869]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 12 14:48:49 crc kubenswrapper[4869]: ho_enable="--enable-hybrid-overlay" Mar 12 14:48:49 crc kubenswrapper[4869]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 12 14:48:49 crc kubenswrapper[4869]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 12 14:48:49 crc kubenswrapper[4869]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 12 14:48:49 crc kubenswrapper[4869]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 14:48:49 crc kubenswrapper[4869]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 12 14:48:49 crc kubenswrapper[4869]: --webhook-host=127.0.0.1 \ Mar 12 14:48:49 crc kubenswrapper[4869]: --webhook-port=9743 \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${ho_enable} \ Mar 12 14:48:49 crc kubenswrapper[4869]: --enable-interconnect \ Mar 12 14:48:49 crc kubenswrapper[4869]: --disable-approver \ Mar 12 14:48:49 crc kubenswrapper[4869]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 12 14:48:49 crc kubenswrapper[4869]: --wait-for-kubernetes-api=200s \ Mar 12 14:48:49 crc kubenswrapper[4869]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 12 14:48:49 crc kubenswrapper[4869]: --loglevel="${LOGLEVEL}" Mar 12 14:48:49 crc kubenswrapper[4869]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc 
kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.764960 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ -f "/env/_master" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:49 crc kubenswrapper[4869]: source "/env/_master" Mar 12 14:48:49 crc kubenswrapper[4869]: set +o allexport Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 12 14:48:49 crc kubenswrapper[4869]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 14:48:49 crc kubenswrapper[4869]: --disable-webhook \ Mar 12 14:48:49 crc kubenswrapper[4869]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 12 14:48:49 crc kubenswrapper[4869]: --loglevel="${LOGLEVEL}" Mar 12 14:48:49 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.766895 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.768279 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c553534_16cf_4a8f_8d01_518e9526a117.slice/crio-0288c5a56b6cc206b32d595b627e13b544cd5b1b91358b7ee6ed882e830011e4 WatchSource:0}: Error finding container 0288c5a56b6cc206b32d595b627e13b544cd5b1b91358b7ee6ed882e830011e4: Status 404 returned error can't find the container with id 0288c5a56b6cc206b32d595b627e13b544cd5b1b91358b7ee6ed882e830011e4 Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.769884 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9pvbj" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.771438 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 12 14:48:49 crc kubenswrapper[4869]: set -euo pipefail Mar 12 14:48:49 crc kubenswrapper[4869]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 12 14:48:49 crc kubenswrapper[4869]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 12 14:48:49 crc kubenswrapper[4869]: # As the secret mount is optional we must wait for the files to be present. Mar 12 14:48:49 crc kubenswrapper[4869]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 12 14:48:49 crc kubenswrapper[4869]: TS=$(date +%s) Mar 12 14:48:49 crc kubenswrapper[4869]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 12 14:48:49 crc kubenswrapper[4869]: HAS_LOGGED_INFO=0 Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: log_missing_certs(){ Mar 12 14:48:49 crc kubenswrapper[4869]: CUR_TS=$(date +%s) Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 12 14:48:49 crc kubenswrapper[4869]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 12 14:48:49 crc kubenswrapper[4869]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 12 14:48:49 crc kubenswrapper[4869]: HAS_LOGGED_INFO=1 Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: } Mar 12 14:48:49 crc kubenswrapper[4869]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 12 14:48:49 crc kubenswrapper[4869]: log_missing_certs Mar 12 14:48:49 crc kubenswrapper[4869]: sleep 5 Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 12 14:48:49 crc kubenswrapper[4869]: exec /usr/bin/kube-rbac-proxy \ Mar 12 14:48:49 crc kubenswrapper[4869]: --logtostderr \ Mar 12 14:48:49 crc kubenswrapper[4869]: --secure-listen-address=:9108 \ Mar 12 14:48:49 crc kubenswrapper[4869]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 12 14:48:49 crc kubenswrapper[4869]: --upstream=http://127.0.0.1:29108/ \ Mar 12 14:48:49 crc kubenswrapper[4869]: --tls-private-key-file=${TLS_PK} \ Mar 12 14:48:49 crc kubenswrapper[4869]: --tls-cert-file=${TLS_CERT} Mar 12 14:48:49 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78csh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-zgf6h_openshift-ovn-kubernetes(2c553534-16cf-4a8f-8d01-518e9526a117): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.776094 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-l8qfx" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.776174 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ -f "/env/_master" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:49 crc kubenswrapper[4869]: source "/env/_master" Mar 12 14:48:49 crc kubenswrapper[4869]: set +o allexport Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v4_join_subnet_opt= Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v6_join_subnet_opt= Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v4_transit_switch_subnet_opt= Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v6_transit_switch_subnet_opt= Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 
14:48:49 crc kubenswrapper[4869]: dns_name_resolver_enabled_flag= Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "false" == "true" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: persistent_ips_enabled_flag= Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "true" == "true" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: # This is needed so that converting clusters from GA to TP Mar 12 14:48:49 crc kubenswrapper[4869]: # will rollout control plane pods as well Mar 12 14:48:49 crc kubenswrapper[4869]: network_segmentation_enabled_flag= Mar 12 14:48:49 crc kubenswrapper[4869]: multi_network_enabled_flag= Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "true" == "true" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: multi_network_enabled_flag="--enable-multi-network" Mar 12 14:48:49 crc kubenswrapper[4869]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 12 14:48:49 crc kubenswrapper[4869]: exec /usr/bin/ovnkube \ Mar 12 14:48:49 crc kubenswrapper[4869]: --enable-interconnect \ Mar 12 14:48:49 crc kubenswrapper[4869]: --init-cluster-manager "${K8S_NODE}" \ Mar 12 14:48:49 crc kubenswrapper[4869]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 12 14:48:49 crc kubenswrapper[4869]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 12 14:48:49 crc kubenswrapper[4869]: --metrics-bind-address "127.0.0.1:29108" \ Mar 12 14:48:49 crc kubenswrapper[4869]: 
--metrics-enable-pprof \ Mar 12 14:48:49 crc kubenswrapper[4869]: --metrics-enable-config-duration \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${ovn_v4_join_subnet_opt} \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${ovn_v6_join_subnet_opt} \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${dns_name_resolver_enabled_flag} \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${persistent_ips_enabled_flag} \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${multi_network_enabled_flag} \ Mar 12 14:48:49 crc kubenswrapper[4869]: ${network_segmentation_enabled_flag} Mar 12 14:48:49 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78csh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-zgf6h_openshift-ovn-kubernetes(2c553534-16cf-4a8f-8d01-518e9526a117): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.778153 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" podUID="2c553534-16cf-4a8f-8d01-518e9526a117" Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.784037 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3a984d_84f6_4421_85dc_ecb4bdb29707.slice/crio-339628b3ced9f4fc8c4846a7b9cb2fe3bf60cd64300fc9d36cea9d8f7755a1ad WatchSource:0}: Error finding container 339628b3ced9f4fc8c4846a7b9cb2fe3bf60cd64300fc9d36cea9d8f7755a1ad: Status 404 returned error can't find the container with id 339628b3ced9f4fc8c4846a7b9cb2fe3bf60cd64300fc9d36cea9d8f7755a1ad Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.785219 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.785245 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.785253 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.785266 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.785276 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.787087 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 12 14:48:49 crc kubenswrapper[4869]: set -uo pipefail Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 12 14:48:49 crc kubenswrapper[4869]: HOSTS_FILE="/etc/hosts" Mar 12 14:48:49 crc kubenswrapper[4869]: TEMP_FILE="/etc/hosts.tmp" Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: # Make a temporary file with the old hosts file's attributes. Mar 12 14:48:49 crc kubenswrapper[4869]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 12 14:48:49 crc kubenswrapper[4869]: echo "Failed to preserve hosts file. Exiting." Mar 12 14:48:49 crc kubenswrapper[4869]: exit 1 Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: while true; do Mar 12 14:48:49 crc kubenswrapper[4869]: declare -A svc_ips Mar 12 14:48:49 crc kubenswrapper[4869]: for svc in "${services[@]}"; do Mar 12 14:48:49 crc kubenswrapper[4869]: # Fetch service IP from cluster dns if present. We make several tries Mar 12 14:48:49 crc kubenswrapper[4869]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 12 14:48:49 crc kubenswrapper[4869]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 12 14:48:49 crc kubenswrapper[4869]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 12 14:48:49 crc kubenswrapper[4869]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 14:48:49 crc kubenswrapper[4869]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 14:48:49 crc kubenswrapper[4869]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 14:48:49 crc kubenswrapper[4869]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 12 14:48:49 crc kubenswrapper[4869]: for i in ${!cmds[*]} Mar 12 14:48:49 crc kubenswrapper[4869]: do Mar 12 14:48:49 crc kubenswrapper[4869]: ips=($(eval "${cmds[i]}")) Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: svc_ips["${svc}"]="${ips[@]}" Mar 12 14:48:49 crc kubenswrapper[4869]: break Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: # Update /etc/hosts only if we get valid service IPs Mar 12 14:48:49 crc kubenswrapper[4869]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 12 14:48:49 crc kubenswrapper[4869]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 12 14:48:49 crc kubenswrapper[4869]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 12 14:48:49 crc kubenswrapper[4869]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 12 14:48:49 crc kubenswrapper[4869]: sleep 60 & wait Mar 12 14:48:49 crc kubenswrapper[4869]: continue Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: # Append resolver entries for services Mar 12 14:48:49 crc kubenswrapper[4869]: rc=0 Mar 12 14:48:49 crc kubenswrapper[4869]: for svc in "${!svc_ips[@]}"; do Mar 12 14:48:49 crc kubenswrapper[4869]: for ip in ${svc_ips[${svc}]}; do Mar 12 14:48:49 crc kubenswrapper[4869]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: if [[ $rc -ne 0 ]]; then Mar 12 14:48:49 crc kubenswrapper[4869]: sleep 60 & wait Mar 12 14:48:49 crc kubenswrapper[4869]: continue Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: Mar 12 14:48:49 crc kubenswrapper[4869]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 12 14:48:49 crc kubenswrapper[4869]: # Replace /etc/hosts with our modified version if needed Mar 12 14:48:49 crc kubenswrapper[4869]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 12 14:48:49 crc kubenswrapper[4869]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 12 14:48:49 crc kubenswrapper[4869]: fi Mar 12 14:48:49 crc kubenswrapper[4869]: sleep 60 & wait Mar 12 14:48:49 crc kubenswrapper[4869]: unset svc_ips Mar 12 14:48:49 crc kubenswrapper[4869]: done Mar 12 14:48:49 crc kubenswrapper[4869]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss8tm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9pvbj_openshift-dns(fe3a984d-84f6-4421-85dc-ecb4bdb29707): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.788981 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9pvbj" 
podUID="fe3a984d-84f6-4421-85dc-ecb4bdb29707" Mar 12 14:48:49 crc kubenswrapper[4869]: W0312 14:48:49.791244 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fd2bb3f_6860_4631_a95c_c910d33724b6.slice/crio-d81b2833a1509b9fe941223d777e310e163f55fdf3f0140b3ef1c732681849e1 WatchSource:0}: Error finding container d81b2833a1509b9fe941223d777e310e163f55fdf3f0140b3ef1c732681849e1: Status 404 returned error can't find the container with id d81b2833a1509b9fe941223d777e310e163f55fdf3f0140b3ef1c732681849e1 Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.793381 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:49 crc kubenswrapper[4869]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 12 14:48:49 crc kubenswrapper[4869]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 12 14:48:49 crc kubenswrapper[4869]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dhml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-l8qfx_openshift-multus(2fd2bb3f-6860-4631-a95c-c910d33724b6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:49 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.794585 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-l8qfx" podUID="2fd2bb3f-6860-4631-a95c-c910d33724b6" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.887778 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.887835 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.887849 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.887865 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.887876 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.950579 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.950721 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:48:50.950695899 +0000 UTC m=+83.235921197 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.950834 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.950904 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.950981 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.951005 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951075 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951074 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951109 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951122 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:50.951109542 +0000 UTC m=+83.236334820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951129 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951182 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:50.951165283 +0000 UTC m=+83.236390581 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951175 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951206 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951219 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951234 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951288 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:50.951272066 +0000 UTC m=+83.236497344 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:49 crc kubenswrapper[4869]: E0312 14:48:49.951338 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:50.951309748 +0000 UTC m=+83.236535056 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.990144 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.990210 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.990222 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.990239 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:49 crc kubenswrapper[4869]: I0312 14:48:49.990250 4869 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:49Z","lastTransitionTime":"2026-03-12T14:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.052251 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.052466 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.052603 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs podName:8415254a-55e8-451e-8be1-364b98f44196 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:51.052580788 +0000 UTC m=+83.337806066 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs") pod "network-metrics-daemon-hllm5" (UID: "8415254a-55e8-451e-8be1-364b98f44196") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.093945 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.094033 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.094063 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.094095 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.094120 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.196698 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.196768 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.196782 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.196799 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.196810 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.299787 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.299863 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.299945 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.299977 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.299994 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.342365 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.342974 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.344273 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.344923 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.345908 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.346675 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.347319 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.348223 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.348948 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.349926 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.350392 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.351566 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.352140 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.352764 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.353846 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.354377 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.355436 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.355889 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.356531 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.357805 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.358332 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.359490 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.359980 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.361019 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.361416 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.362148 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.363328 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.363880 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.364889 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.365337 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.366273 4869 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.366374 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.368155 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.369113 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.369691 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.371183 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.371814 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.372747 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.373652 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.374923 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.375482 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.376623 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.377260 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.378327 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.378872 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.379794 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.380289 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.381468 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.382112 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.382981 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.383436 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.384419 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.384991 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.385443 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.402383 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.402431 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc 
kubenswrapper[4869]: I0312 14:48:50.402450 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.402471 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.402483 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.505700 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.505748 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.505766 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.505790 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.505809 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.607191 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.607222 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.607231 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.607243 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.607251 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.608406 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9pvbj" event={"ID":"fe3a984d-84f6-4421-85dc-ecb4bdb29707","Type":"ContainerStarted","Data":"339628b3ced9f4fc8c4846a7b9cb2fe3bf60cd64300fc9d36cea9d8f7755a1ad"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.609935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wkrgx" event={"ID":"f5344bfd-e537-4710-abf4-24ece04a3ff0","Type":"ContainerStarted","Data":"b4517203c5417971561d1cac24e53ab4f9a9ccba2797a34f4a5f4cd4225dfc5c"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.610522 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc kubenswrapper[4869]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 12 14:48:50 crc kubenswrapper[4869]: set -uo pipefail Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 12 14:48:50 crc kubenswrapper[4869]: HOSTS_FILE="/etc/hosts" Mar 12 14:48:50 crc kubenswrapper[4869]: TEMP_FILE="/etc/hosts.tmp" Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: # Make a temporary file with the old hosts file's attributes. Mar 12 14:48:50 crc kubenswrapper[4869]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 12 14:48:50 crc kubenswrapper[4869]: echo "Failed to preserve hosts file. Exiting." 
Mar 12 14:48:50 crc kubenswrapper[4869]: exit 1 Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: while true; do Mar 12 14:48:50 crc kubenswrapper[4869]: declare -A svc_ips Mar 12 14:48:50 crc kubenswrapper[4869]: for svc in "${services[@]}"; do Mar 12 14:48:50 crc kubenswrapper[4869]: # Fetch service IP from cluster dns if present. We make several tries Mar 12 14:48:50 crc kubenswrapper[4869]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 12 14:48:50 crc kubenswrapper[4869]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 12 14:48:50 crc kubenswrapper[4869]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 12 14:48:50 crc kubenswrapper[4869]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 14:48:50 crc kubenswrapper[4869]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 14:48:50 crc kubenswrapper[4869]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 14:48:50 crc kubenswrapper[4869]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 12 14:48:50 crc kubenswrapper[4869]: for i in ${!cmds[*]} Mar 12 14:48:50 crc kubenswrapper[4869]: do Mar 12 14:48:50 crc kubenswrapper[4869]: ips=($(eval "${cmds[i]}")) Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: svc_ips["${svc}"]="${ips[@]}" Mar 12 14:48:50 crc kubenswrapper[4869]: break Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: # Update /etc/hosts only if we get valid service IPs Mar 12 14:48:50 crc kubenswrapper[4869]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 12 14:48:50 crc kubenswrapper[4869]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 12 14:48:50 crc kubenswrapper[4869]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 12 14:48:50 crc kubenswrapper[4869]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 12 14:48:50 crc kubenswrapper[4869]: sleep 60 & wait Mar 12 14:48:50 crc kubenswrapper[4869]: continue Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: # Append resolver entries for services Mar 12 14:48:50 crc kubenswrapper[4869]: rc=0 Mar 12 14:48:50 crc kubenswrapper[4869]: for svc in "${!svc_ips[@]}"; do Mar 12 14:48:50 crc kubenswrapper[4869]: for ip in ${svc_ips[${svc}]}; do Mar 12 14:48:50 crc kubenswrapper[4869]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ $rc -ne 0 ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: sleep 60 & wait Mar 12 14:48:50 crc kubenswrapper[4869]: continue Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 12 14:48:50 crc kubenswrapper[4869]: # Replace /etc/hosts with our modified version if needed Mar 12 14:48:50 crc kubenswrapper[4869]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 12 14:48:50 crc kubenswrapper[4869]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: sleep 60 & wait Mar 12 14:48:50 crc kubenswrapper[4869]: unset svc_ips Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss8tm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9pvbj_openshift-dns(fe3a984d-84f6-4421-85dc-ecb4bdb29707): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.611724 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9b173a6fbedb6874a1e228918d90d518bb20cf086e68a0add88e6ed2a3365475"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.612628 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc kubenswrapper[4869]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 12 
14:48:50 crc kubenswrapper[4869]: while [ true ]; Mar 12 14:48:50 crc kubenswrapper[4869]: do Mar 12 14:48:50 crc kubenswrapper[4869]: for f in $(ls /tmp/serviceca); do Mar 12 14:48:50 crc kubenswrapper[4869]: echo $f Mar 12 14:48:50 crc kubenswrapper[4869]: ca_file_path="/tmp/serviceca/${f}" Mar 12 14:48:50 crc kubenswrapper[4869]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 12 14:48:50 crc kubenswrapper[4869]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 12 14:48:50 crc kubenswrapper[4869]: if [ -e "${reg_dir_path}" ]; then Mar 12 14:48:50 crc kubenswrapper[4869]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 12 14:48:50 crc kubenswrapper[4869]: else Mar 12 14:48:50 crc kubenswrapper[4869]: mkdir $reg_dir_path Mar 12 14:48:50 crc kubenswrapper[4869]: cp $ca_file_path $reg_dir_path/ca.crt Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: for d in $(ls /etc/docker/certs.d); do Mar 12 14:48:50 crc kubenswrapper[4869]: echo $d Mar 12 14:48:50 crc kubenswrapper[4869]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 12 14:48:50 crc kubenswrapper[4869]: reg_conf_path="/tmp/serviceca/${dp}" Mar 12 14:48:50 crc kubenswrapper[4869]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 12 14:48:50 crc kubenswrapper[4869]: rm -rf /etc/docker/certs.d/$d Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: sleep 60 & wait ${!} Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxrpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-wkrgx_openshift-image-registry(f5344bfd-e537-4710-abf4-24ece04a3ff0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.612773 4869 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-9pvbj" podUID="fe3a984d-84f6-4421-85dc-ecb4bdb29707" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.613302 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc kubenswrapper[4869]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 12 14:48:50 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: source /etc/kubernetes/apiserver-url.env Mar 12 14:48:50 crc kubenswrapper[4869]: else Mar 12 14:48:50 crc kubenswrapper[4869]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 12 14:48:50 crc kubenswrapper[4869]: exit 1 Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 12 14:48:50 crc kubenswrapper[4869]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.613417 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e898c62f5c7075089a7c34e9b3608a62f82343958020d3331f218dd94150a1a9"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.613728 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-image-registry/node-ca-wkrgx" podUID="f5344bfd-e537-4710-abf4-24ece04a3ff0" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.614380 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.615207 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e176f194db94f38f8699b67ee930578b3fe4e650b9d66be55f7c9c2d6aa625ce"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.616138 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" event={"ID":"2c553534-16cf-4a8f-8d01-518e9526a117","Type":"ContainerStarted","Data":"0288c5a56b6cc206b32d595b627e13b544cd5b1b91358b7ee6ed882e830011e4"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.616607 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc kubenswrapper[4869]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ -f "/env/_master" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:50 crc kubenswrapper[4869]: source "/env/_master" Mar 12 14:48:50 crc kubenswrapper[4869]: set +o allexport Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 12 14:48:50 crc kubenswrapper[4869]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 12 14:48:50 crc kubenswrapper[4869]: ho_enable="--enable-hybrid-overlay" Mar 12 14:48:50 crc kubenswrapper[4869]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 12 14:48:50 crc kubenswrapper[4869]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 12 14:48:50 crc kubenswrapper[4869]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 12 14:48:50 crc kubenswrapper[4869]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 14:48:50 crc kubenswrapper[4869]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 12 14:48:50 crc kubenswrapper[4869]: --webhook-host=127.0.0.1 \ Mar 12 14:48:50 crc kubenswrapper[4869]: --webhook-port=9743 \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${ho_enable} \ Mar 12 14:48:50 crc kubenswrapper[4869]: --enable-interconnect \ Mar 12 14:48:50 crc kubenswrapper[4869]: --disable-approver \ Mar 12 14:48:50 crc kubenswrapper[4869]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 12 14:48:50 crc kubenswrapper[4869]: --wait-for-kubernetes-api=200s \ Mar 12 14:48:50 crc kubenswrapper[4869]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 12 14:48:50 crc kubenswrapper[4869]: --loglevel="${LOGLEVEL}" Mar 12 14:48:50 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.616713 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.617501 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc kubenswrapper[4869]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 12 14:48:50 crc kubenswrapper[4869]: set -euo pipefail Mar 12 14:48:50 crc kubenswrapper[4869]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 12 14:48:50 crc kubenswrapper[4869]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 12 14:48:50 crc kubenswrapper[4869]: # As the secret mount is optional we must wait for the files to be present. Mar 12 14:48:50 crc kubenswrapper[4869]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 12 14:48:50 crc kubenswrapper[4869]: TS=$(date +%s) Mar 12 14:48:50 crc kubenswrapper[4869]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 12 14:48:50 crc kubenswrapper[4869]: HAS_LOGGED_INFO=0 Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: log_missing_certs(){ Mar 12 14:48:50 crc kubenswrapper[4869]: CUR_TS=$(date +%s) Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 12 14:48:50 crc kubenswrapper[4869]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 12 14:48:50 crc kubenswrapper[4869]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 12 14:48:50 crc kubenswrapper[4869]: HAS_LOGGED_INFO=1 Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: } Mar 12 14:48:50 crc kubenswrapper[4869]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 12 14:48:50 crc kubenswrapper[4869]: log_missing_certs Mar 12 14:48:50 crc kubenswrapper[4869]: sleep 5 Mar 12 14:48:50 crc kubenswrapper[4869]: done Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 12 14:48:50 crc kubenswrapper[4869]: exec /usr/bin/kube-rbac-proxy \ Mar 12 14:48:50 crc kubenswrapper[4869]: --logtostderr \ Mar 12 14:48:50 crc kubenswrapper[4869]: --secure-listen-address=:9108 \ Mar 12 14:48:50 crc kubenswrapper[4869]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 12 14:48:50 crc kubenswrapper[4869]: --upstream=http://127.0.0.1:29108/ \ Mar 12 14:48:50 crc kubenswrapper[4869]: --tls-private-key-file=${TLS_PK} \ Mar 12 14:48:50 crc kubenswrapper[4869]: --tls-cert-file=${TLS_CERT} Mar 12 14:48:50 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78csh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-zgf6h_openshift-ovn-kubernetes(2c553534-16cf-4a8f-8d01-518e9526a117): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.617772 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.617862 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerStarted","Data":"ce2c7f26517887d5b7c22557cff1fdcdc8a210f8de95d3bc6131ca5bee077576"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.618631 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc kubenswrapper[4869]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ -f "/env/_master" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:50 crc kubenswrapper[4869]: source "/env/_master" Mar 12 14:48:50 crc kubenswrapper[4869]: set +o allexport Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 12 14:48:50 crc kubenswrapper[4869]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 14:48:50 crc kubenswrapper[4869]: --disable-webhook \ Mar 12 14:48:50 crc kubenswrapper[4869]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 12 14:48:50 crc kubenswrapper[4869]: --loglevel="${LOGLEVEL}" Mar 12 14:48:50 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.619162 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s6qq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-thzhj_openshift-multus(9eee1745-fda9-4657-8cb0-d491ae450f82): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.619231 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 
crc kubenswrapper[4869]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ -f "/env/_master" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: set -o allexport Mar 12 14:48:50 crc kubenswrapper[4869]: source "/env/_master" Mar 12 14:48:50 crc kubenswrapper[4869]: set +o allexport Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v4_join_subnet_opt= Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v6_join_subnet_opt= Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v4_transit_switch_subnet_opt= Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v6_transit_switch_subnet_opt= Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "" != "" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: dns_name_resolver_enabled_flag= Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "false" == "true" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: 
dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: persistent_ips_enabled_flag= Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "true" == "true" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: # This is needed so that converting clusters from GA to TP Mar 12 14:48:50 crc kubenswrapper[4869]: # will rollout control plane pods as well Mar 12 14:48:50 crc kubenswrapper[4869]: network_segmentation_enabled_flag= Mar 12 14:48:50 crc kubenswrapper[4869]: multi_network_enabled_flag= Mar 12 14:48:50 crc kubenswrapper[4869]: if [[ "true" == "true" ]]; then Mar 12 14:48:50 crc kubenswrapper[4869]: multi_network_enabled_flag="--enable-multi-network" Mar 12 14:48:50 crc kubenswrapper[4869]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 12 14:48:50 crc kubenswrapper[4869]: fi Mar 12 14:48:50 crc kubenswrapper[4869]: Mar 12 14:48:50 crc kubenswrapper[4869]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 12 14:48:50 crc kubenswrapper[4869]: exec /usr/bin/ovnkube \ Mar 12 14:48:50 crc kubenswrapper[4869]: --enable-interconnect \ Mar 12 14:48:50 crc kubenswrapper[4869]: --init-cluster-manager "${K8S_NODE}" \ Mar 12 14:48:50 crc kubenswrapper[4869]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 12 14:48:50 crc kubenswrapper[4869]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 12 14:48:50 crc kubenswrapper[4869]: --metrics-bind-address "127.0.0.1:29108" \ Mar 12 14:48:50 crc kubenswrapper[4869]: --metrics-enable-pprof \ Mar 12 14:48:50 crc kubenswrapper[4869]: --metrics-enable-config-duration \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${ovn_v4_join_subnet_opt} \ Mar 12 14:48:50 crc 
kubenswrapper[4869]: ${ovn_v6_join_subnet_opt} \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${dns_name_resolver_enabled_flag} \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${persistent_ips_enabled_flag} \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${multi_network_enabled_flag} \ Mar 12 14:48:50 crc kubenswrapper[4869]: ${network_segmentation_enabled_flag} Mar 12 14:48:50 crc kubenswrapper[4869]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78csh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-zgf6h_openshift-ovn-kubernetes(2c553534-16cf-4a8f-8d01-518e9526a117): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.619412 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l8qfx" event={"ID":"2fd2bb3f-6860-4631-a95c-c910d33724b6","Type":"ContainerStarted","Data":"d81b2833a1509b9fe941223d777e310e163f55fdf3f0140b3ef1c732681849e1"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.619818 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.620394 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-thzhj" podUID="9eee1745-fda9-4657-8cb0-d491ae450f82" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.620536 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" podUID="2c553534-16cf-4a8f-8d01-518e9526a117" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.620733 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc kubenswrapper[4869]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 12 14:48:50 crc kubenswrapper[4869]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 12 14:48:50 crc kubenswrapper[4869]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dhml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-l8qfx_openshift-multus(2fd2bb3f-6860-4631-a95c-c910d33724b6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.620828 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"d56c38143aed882c4ecd67b37d48878589ec84858bdb72f71b57711fe9316c77"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.621776 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-l8qfx" podUID="2fd2bb3f-6860-4631-a95c-c910d33724b6" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.622010 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gr2bz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.622113 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"7f1e42c08801651f0218ac42cd5a6a27bea2aa0c9fff4ba369ad616449216821"} Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.622894 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 14:48:50 crc 
kubenswrapper[4869]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 12 14:48:50 crc kubenswrapper[4869]: apiVersion: v1 Mar 12 14:48:50 crc kubenswrapper[4869]: clusters: Mar 12 14:48:50 crc kubenswrapper[4869]: - cluster: Mar 12 14:48:50 crc kubenswrapper[4869]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 12 14:48:50 crc kubenswrapper[4869]: server: https://api-int.crc.testing:6443 Mar 12 14:48:50 crc kubenswrapper[4869]: name: default-cluster Mar 12 14:48:50 crc kubenswrapper[4869]: contexts: Mar 12 14:48:50 crc kubenswrapper[4869]: - context: Mar 12 14:48:50 crc kubenswrapper[4869]: cluster: default-cluster Mar 12 14:48:50 crc kubenswrapper[4869]: namespace: default Mar 12 14:48:50 crc kubenswrapper[4869]: user: default-auth Mar 12 14:48:50 crc kubenswrapper[4869]: name: default-context Mar 12 14:48:50 crc kubenswrapper[4869]: current-context: default-context Mar 12 14:48:50 crc kubenswrapper[4869]: kind: Config Mar 12 14:48:50 crc kubenswrapper[4869]: preferences: {} Mar 12 14:48:50 crc kubenswrapper[4869]: users: Mar 12 14:48:50 crc kubenswrapper[4869]: - name: default-auth Mar 12 14:48:50 crc kubenswrapper[4869]: user: Mar 12 14:48:50 crc kubenswrapper[4869]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 14:48:50 crc kubenswrapper[4869]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 14:48:50 crc kubenswrapper[4869]: EOF Mar 12 14:48:50 crc kubenswrapper[4869]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7rnt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-42vwv_openshift-ovn-kubernetes(7edaf111-2689-4453-ba78-00677e1b6316): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 14:48:50 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.624421 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.624662 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gr2bz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.625845 4869 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.629350 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.639875 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.648353 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.663037 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.673175 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.680150 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.689286 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.702659 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.709426 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.709733 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.709913 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.710030 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.710138 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.711164 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.720313 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.729048 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.734493 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.742383 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.750587 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.760671 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.768793 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.775343 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.781912 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.789098 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.805585 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.813532 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.813749 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.813841 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.813930 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.814004 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.817433 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.826420 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.835853 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.843003 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.852089 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.863114 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.871245 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.880604 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.916860 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.916925 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.916937 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.916955 4869 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.916968 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:50Z","lastTransitionTime":"2026-03-12T14:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.961108 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.961233 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961256 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:48:52.961238009 +0000 UTC m=+85.246463287 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.961280 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.961309 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:50 crc kubenswrapper[4869]: I0312 14:48:50.961361 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961364 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961396 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961397 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961414 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961438 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:52.961428144 +0000 UTC m=+85.246653422 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961435 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961455 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:52.961447395 +0000 UTC m=+85.246672673 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961457 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961561 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:52.961513497 +0000 UTC m=+85.246738785 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961467 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961700 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:50 crc kubenswrapper[4869]: E0312 14:48:50.961890 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:52.961845917 +0000 UTC m=+85.247071225 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.018884 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.018958 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.018972 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.018988 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.018999 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.062501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:51 crc kubenswrapper[4869]: E0312 14:48:51.062837 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:51 crc kubenswrapper[4869]: E0312 14:48:51.063045 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs podName:8415254a-55e8-451e-8be1-364b98f44196 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:53.063011394 +0000 UTC m=+85.348236682 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs") pod "network-metrics-daemon-hllm5" (UID: "8415254a-55e8-451e-8be1-364b98f44196") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.091750 4869 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.121206 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.121254 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.121271 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.121292 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.121307 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.223417 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.223457 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.223467 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.223483 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.223493 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.325562 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.325866 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.326033 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.326292 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.326466 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.336124 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.336188 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.336145 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:51 crc kubenswrapper[4869]: E0312 14:48:51.336319 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:48:51 crc kubenswrapper[4869]: E0312 14:48:51.336492 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:48:51 crc kubenswrapper[4869]: E0312 14:48:51.336674 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.336720 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:51 crc kubenswrapper[4869]: E0312 14:48:51.336928 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.429313 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.429381 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.429407 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.429435 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.429455 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.531647 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.531698 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.531707 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.531721 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.531731 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.634197 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.634232 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.634241 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.634254 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.634262 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.736817 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.736860 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.736871 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.736885 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.736895 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.840426 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.840493 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.840510 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.840534 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.840577 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.942710 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.942778 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.942800 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.942828 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:51 crc kubenswrapper[4869]: I0312 14:48:51.942850 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:51Z","lastTransitionTime":"2026-03-12T14:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.045755 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.045822 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.045847 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.045877 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.045900 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.149027 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.149103 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.149127 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.149223 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.149250 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.252308 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.252375 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.252393 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.252417 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.252433 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.355073 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.355105 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.355114 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.355128 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.355138 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.457215 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.457262 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.457274 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.457295 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.457307 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.560243 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.560287 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.560298 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.560315 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.560326 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.662808 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.662849 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.662862 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.662881 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.662893 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.765943 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.765992 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.766012 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.766035 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.766053 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.867844 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.867988 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.868001 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.868020 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.868031 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.970372 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.970454 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.970474 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.970498 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.970522 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:52Z","lastTransitionTime":"2026-03-12T14:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.982939 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.983111 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.983159 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983254 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:48:56.983219184 +0000 UTC m=+89.268444502 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983272 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983304 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983334 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983355 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983361 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:56.983347607 +0000 UTC m=+89.268572925 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.983407 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983426 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:56.983402479 +0000 UTC m=+89.268627797 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:52 crc kubenswrapper[4869]: I0312 14:48:52.983478 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983711 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983765 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983773 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983783 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:52 crc 
kubenswrapper[4869]: E0312 14:48:52.983837 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:56.983821602 +0000 UTC m=+89.269046910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:52 crc kubenswrapper[4869]: E0312 14:48:52.983860 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:56.983849072 +0000 UTC m=+89.269074390 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.072754 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.072791 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.072816 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.072830 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.072839 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.084405 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:53 crc kubenswrapper[4869]: E0312 14:48:53.084592 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:53 crc kubenswrapper[4869]: E0312 14:48:53.084657 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs podName:8415254a-55e8-451e-8be1-364b98f44196 nodeName:}" failed. No retries permitted until 2026-03-12 14:48:57.084638997 +0000 UTC m=+89.369864275 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs") pod "network-metrics-daemon-hllm5" (UID: "8415254a-55e8-451e-8be1-364b98f44196") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.175668 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.175711 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.175721 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.175735 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.175744 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.278598 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.278675 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.278697 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.278730 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.278752 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.335853 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.335923 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.335869 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.336057 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:53 crc kubenswrapper[4869]: E0312 14:48:53.336274 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:48:53 crc kubenswrapper[4869]: E0312 14:48:53.336496 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:48:53 crc kubenswrapper[4869]: E0312 14:48:53.336653 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:48:53 crc kubenswrapper[4869]: E0312 14:48:53.336749 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.381045 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.381086 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.381098 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.381115 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.381129 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.483776 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.483833 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.483852 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.483875 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.483892 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.586752 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.586816 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.586834 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.586858 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.586878 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.689281 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.689328 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.689339 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.689360 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.689371 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.791934 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.791963 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.791972 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.791989 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.791999 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.895471 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.895581 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.895602 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.895637 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.895659 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.999433 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.999505 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.999523 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.999591 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:53 crc kubenswrapper[4869]: I0312 14:48:53.999612 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:53Z","lastTransitionTime":"2026-03-12T14:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.102670 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.102724 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.102738 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.102760 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.102775 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.206468 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.206565 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.206586 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.206613 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.206633 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.310327 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.310610 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.310694 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.310773 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.310869 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.413293 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.413338 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.413347 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.413363 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.413371 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.515637 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.516043 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.516215 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.516356 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.516490 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.619437 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.619479 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.619491 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.619508 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.619520 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.722379 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.722447 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.722465 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.722488 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.722505 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.825643 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.825723 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.825737 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.825756 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.825768 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.928399 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.928446 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.928454 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.928469 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:54 crc kubenswrapper[4869]: I0312 14:48:54.928477 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:54Z","lastTransitionTime":"2026-03-12T14:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.030777 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.030841 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.030857 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.030882 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.030904 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.133428 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.133476 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.133485 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.133498 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.133507 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.235867 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.235910 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.235920 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.235934 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.235943 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.336480 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.336513 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.336615 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.336513 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:55 crc kubenswrapper[4869]: E0312 14:48:55.336656 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:48:55 crc kubenswrapper[4869]: E0312 14:48:55.336721 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:48:55 crc kubenswrapper[4869]: E0312 14:48:55.336777 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:48:55 crc kubenswrapper[4869]: E0312 14:48:55.336806 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.337921 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.337948 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.337957 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.337972 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.337982 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.440159 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.440664 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.440832 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.440929 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.441003 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.543939 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.544295 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.544408 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.544503 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.544610 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.647524 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.647587 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.647606 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.647628 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.647639 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.749936 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.750352 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.750452 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.750605 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.750733 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.852861 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.852895 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.852903 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.852917 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.852925 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.957044 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.957088 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.957100 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.957120 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:55 crc kubenswrapper[4869]: I0312 14:48:55.957132 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:55Z","lastTransitionTime":"2026-03-12T14:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.059865 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.059918 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.059930 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.059946 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.059960 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.162426 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.162467 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.162478 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.162496 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.162508 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.265004 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.265037 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.265045 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.265059 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.265067 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.367371 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.367417 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.367432 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.367456 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.367469 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.469218 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.469256 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.469275 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.469292 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.469303 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.571924 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.571962 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.571972 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.571987 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.571998 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.673369 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.673403 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.673411 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.673423 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.673431 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.776318 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.776348 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.776356 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.776369 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.776379 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.879364 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.879426 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.879439 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.879458 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.879470 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.983240 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.983328 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.983347 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.983373 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:56 crc kubenswrapper[4869]: I0312 14:48:56.983393 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:56Z","lastTransitionTime":"2026-03-12T14:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.027069 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.027187 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.027216 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.027242 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.027263 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027370 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027417 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:05.027403308 +0000 UTC m=+97.312628586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027565 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027642 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027639 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:57 
crc kubenswrapper[4869]: E0312 14:48:57.027568 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027789 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027811 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027533 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:05.0274689 +0000 UTC m=+97.312694218 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027666 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027888 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:05.027860801 +0000 UTC m=+97.313086109 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027910 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:05.027898962 +0000 UTC m=+97.313124270 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.027959 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:05.027929563 +0000 UTC m=+97.313154881 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.087895 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.087954 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.087964 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.087979 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.087988 4869 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.128630 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.128789 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.128844 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs podName:8415254a-55e8-451e-8be1-364b98f44196 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:05.128830173 +0000 UTC m=+97.414055451 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs") pod "network-metrics-daemon-hllm5" (UID: "8415254a-55e8-451e-8be1-364b98f44196") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.190465 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.190535 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.190589 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.190615 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.190632 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.293102 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.293150 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.293160 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.293175 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.293186 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.336261 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.336323 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.336407 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.336530 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.336634 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.336796 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.336937 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:48:57 crc kubenswrapper[4869]: E0312 14:48:57.337147 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.395854 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.395904 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.395916 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.395935 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.395947 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.498296 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.498336 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.498380 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.498397 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.498410 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.607102 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.607148 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.607162 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.607181 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.607196 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.709816 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.709859 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.709873 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.709891 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.709905 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.812244 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.812288 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.812302 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.812321 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.812333 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.913813 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.913866 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.913878 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.913895 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:57 crc kubenswrapper[4869]: I0312 14:48:57.913907 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:57Z","lastTransitionTime":"2026-03-12T14:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.016266 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.016307 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.016315 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.016329 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.016340 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.118768 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.119031 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.119138 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.119234 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.119305 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.221145 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.221184 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.221195 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.221210 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.221222 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.323453 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.323501 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.323517 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.323560 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.323577 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.356638 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.367713 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.377572 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.383837 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.389751 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.397267 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.404885 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.413806 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.422454 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.425628 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.425659 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.425671 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.425688 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.425699 4869 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.432908 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.439482 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.449284 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.457858 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.465841 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.527395 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.527614 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.527719 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.527813 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.527911 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.630250 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.630410 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.630516 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.630684 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.630707 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.733106 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.733142 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.733151 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.733167 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.733177 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.790223 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.790251 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.790262 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.790276 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.790286 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: E0312 14:48:58.800962 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.804017 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.804069 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.804086 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.804107 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.804123 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: E0312 14:48:58.815807 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.820685 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.820716 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.820733 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.820747 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.820761 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: E0312 14:48:58.837079 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.843568 4869 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.844643 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.844680 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.844699 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.844718 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.844731 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: E0312 14:48:58.856829 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.861231 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.861283 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.861299 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.861318 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.861331 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: E0312 14:48:58.869868 4869 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0727d113-6abb-4498-952f-5280a3e03df5\\\",\\\"systemUUID\\\":\\\"2ba13367-485d-48d1-abc3-723587dc31cc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:48:58 crc kubenswrapper[4869]: E0312 14:48:58.870019 4869 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.872071 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.872124 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.872135 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.872156 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.872169 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.975023 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.975059 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.975068 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.975083 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:58 crc kubenswrapper[4869]: I0312 14:48:58.975093 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:58Z","lastTransitionTime":"2026-03-12T14:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.076949 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.077590 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.077660 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.077730 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.077802 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.179512 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.179578 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.179591 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.179609 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.179625 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.281645 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.281694 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.281733 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.281757 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.281773 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.336226 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.336227 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.336227 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:48:59 crc kubenswrapper[4869]: E0312 14:48:59.336807 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.336384 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:48:59 crc kubenswrapper[4869]: E0312 14:48:59.336732 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:48:59 crc kubenswrapper[4869]: E0312 14:48:59.336909 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:48:59 crc kubenswrapper[4869]: E0312 14:48:59.336920 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.384442 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.384763 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.384880 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.384999 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.385138 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.487656 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.487950 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.488038 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.488194 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.488289 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.591798 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.591866 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.591881 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.591919 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.591935 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.694971 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.695327 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.695582 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.695833 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.696036 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.799245 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.799286 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.799297 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.799314 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.799325 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.902960 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.903030 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.903053 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.903081 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:48:59 crc kubenswrapper[4869]: I0312 14:48:59.903098 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:48:59Z","lastTransitionTime":"2026-03-12T14:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.005918 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.005981 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.005997 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.006021 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.006037 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.108658 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.108699 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.108712 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.108728 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.108739 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.210981 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.211020 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.211032 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.211048 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.211059 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.314526 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.314939 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.315069 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.315211 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.315354 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.355712 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.355751 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:49:00 crc kubenswrapper[4869]: E0312 14:49:00.356007 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.418041 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.418084 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.418094 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.418112 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.418122 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.520602 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.520648 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.520661 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.520678 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.520690 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.623149 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.623200 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.623215 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.623234 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.623246 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.646972 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:49:00 crc kubenswrapper[4869]: E0312 14:49:00.647123 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.725972 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.726031 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.726048 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.726076 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.726095 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.828828 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.828870 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.828879 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.828895 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.828905 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.931630 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.931762 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.931778 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.931794 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:00 crc kubenswrapper[4869]: I0312 14:49:00.931807 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:00Z","lastTransitionTime":"2026-03-12T14:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.034085 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.034129 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.034144 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.034161 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.034174 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.136799 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.136840 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.136853 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.136869 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.136880 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.238981 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.239022 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.239034 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.239067 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.239077 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.336243 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.336259 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.336318 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.336604 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:01 crc kubenswrapper[4869]: E0312 14:49:01.336692 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:01 crc kubenswrapper[4869]: E0312 14:49:01.336833 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:49:01 crc kubenswrapper[4869]: E0312 14:49:01.337024 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:49:01 crc kubenswrapper[4869]: E0312 14:49:01.337408 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.344557 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.344601 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.344618 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.344640 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.344655 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.447025 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.447053 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.447061 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.447074 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.447084 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.549334 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.549360 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.549369 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.549383 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.549392 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.650177 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerStarted","Data":"c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.651715 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.653670 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" event={"ID":"2c553534-16cf-4a8f-8d01-518e9526a117","Type":"ContainerStarted","Data":"a72e3e16974f6fddb4523de95bc7146e30691b9c130cbc9fa827536a79fe8947"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.656525 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.656700 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.656760 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.656847 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.656916 4869 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.663070 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.673445 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.682017 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.689765 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.701451 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.710351 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.720524 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.728537 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.737011 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.744701 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.758152 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.759322 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.759361 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.759375 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.759394 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.759407 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.766444 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.775608 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.781889 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.788360 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.794857 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.801009 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.807106 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.821339 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.830202 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.837781 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.846180 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.852012 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.860526 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.861247 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.861276 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.861286 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.861302 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.861314 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.868780 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.878343 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.887601 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.898077 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.905376 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.913853 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.964025 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.964067 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.964078 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.964092 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:01 crc kubenswrapper[4869]: I0312 14:49:01.964101 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:01Z","lastTransitionTime":"2026-03-12T14:49:01Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.066489 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.066576 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.066594 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.066616 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.066632 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.168975 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.169019 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.169030 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.169049 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.169061 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.271282 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.271324 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.271334 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.271348 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.271358 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.376248 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.376311 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.376325 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.376345 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.376363 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.478780 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.479133 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.479142 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.479155 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.479163 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.581184 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.581220 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.581232 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.581245 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.581254 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.658322 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l8qfx" event={"ID":"2fd2bb3f-6860-4631-a95c-c910d33724b6","Type":"ContainerStarted","Data":"a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.659979 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"f426ea0e5ab79ee4591867b34f5081ee4879d1f2504afa9dd9e6f27592c8fc8d"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.660137 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.662345 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" event={"ID":"2c553534-16cf-4a8f-8d01-518e9526a117","Type":"ContainerStarted","Data":"871652decbaa741a986c3b80072047f40fd1079b6714a8c22fed0863a7295869"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.664006 4869 generic.go:334] "Generic (PLEG): container finished" podID="9eee1745-fda9-4657-8cb0-d491ae450f82" containerID="c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb" exitCode=0 Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.664047 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerDied","Data":"c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 
14:49:02.676093 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.683668 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.683864 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.684017 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.684110 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.684178 4869 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.687520 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.699968 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.708915 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.716406 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.727210 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.735639 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.742681 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.749769 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.755699 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.762388 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.778775 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.787478 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.789607 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.789735 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.789900 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.790045 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.788651 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.800738 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.807510 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.815805 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.823748 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.831626 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.841466 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f426ea0e5ab79ee4591867b34f5081ee4879d1f2504afa9dd9e6f27592c8fc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e48f067c773716c7d24fab1c2ac1e1bfd0b073b
1e56d62472b739aafe4d8ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.857590 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.866026 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.876831 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.885737 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.892400 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.892773 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.892804 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.892812 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.892825 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.892834 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.901098 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.909075 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.918906 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.929022 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.939126 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.946605 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72e3e16974f6fddb4523de95bc7146e30691b9c130cbc9fa827536a79fe8947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871652decbaa741a986c3b80072047f40fd10
79b6714a8c22fed0863a7295869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.995449 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.995765 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.995852 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:02 crc 
kubenswrapper[4869]: I0312 14:49:02.995946 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:02 crc kubenswrapper[4869]: I0312 14:49:02.996024 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:02Z","lastTransitionTime":"2026-03-12T14:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.099092 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.099139 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.099154 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.099172 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.099184 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.201217 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.201259 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.201270 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.201284 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.201294 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.310832 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.310868 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.310877 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.310890 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.310900 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.336004 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.336036 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.336131 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.336172 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:03 crc kubenswrapper[4869]: E0312 14:49:03.336401 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:03 crc kubenswrapper[4869]: E0312 14:49:03.336518 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:49:03 crc kubenswrapper[4869]: E0312 14:49:03.336599 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:49:03 crc kubenswrapper[4869]: E0312 14:49:03.336666 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.413633 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.413797 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.414038 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.414110 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.414179 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.521660 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.522099 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.522111 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.522127 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.522140 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.624685 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.624716 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.624726 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.624739 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.624747 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.669350 4869 generic.go:334] "Generic (PLEG): container finished" podID="9eee1745-fda9-4657-8cb0-d491ae450f82" containerID="a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3" exitCode=0 Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.669457 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerDied","Data":"a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.681571 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.691474 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.699819 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.710427 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.717700 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.728501 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.737820 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.737872 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.737885 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.737900 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.737915 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.740503 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.749168 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72e3e16974f6fddb4523de95bc7146e30691b9c130cbc9fa827536a79fe8947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871652decbaa741a986c3b80072047f40fd10
79b6714a8c22fed0863a7295869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.757587 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.764354 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.774204 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f426ea0e5ab79ee4591867b34f5081ee4879d1f2504afa9dd9e6f27592c8fc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e48f067c773716c7d24fab1c2ac1e1bfd0b073b
1e56d62472b739aafe4d8ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.787661 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.796137 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.808221 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.818043 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.840291 4869 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.840348 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.840361 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.840378 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.840390 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.942693 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.942730 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.942740 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.942753 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:03 crc kubenswrapper[4869]: I0312 14:49:03.942762 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:03Z","lastTransitionTime":"2026-03-12T14:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.045101 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.045177 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.045191 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.045209 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.045221 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.148251 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.148635 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.148680 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.148697 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.148733 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.251135 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.251165 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.251173 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.251187 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.251196 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.353415 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.353454 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.353465 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.353483 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.353495 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.456671 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.456725 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.456738 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.456756 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.456769 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.559718 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.559753 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.559762 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.559775 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.559785 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.666078 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.666551 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.666562 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.666576 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.666585 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.675208 4869 generic.go:334] "Generic (PLEG): container finished" podID="9eee1745-fda9-4657-8cb0-d491ae450f82" containerID="1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295" exitCode=0 Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.675296 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerDied","Data":"1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.684197 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.684744 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9pvbj" event={"ID":"fe3a984d-84f6-4421-85dc-ecb4bdb29707","Type":"ContainerStarted","Data":"0b388272704901dd5d4d191687ea9c65ebd7e95b7613df1ecad194895cebfe7a"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.686765 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wkrgx" 
event={"ID":"f5344bfd-e537-4710-abf4-24ece04a3ff0","Type":"ContainerStarted","Data":"5df5779361669f5732f13aaefc4675083c22827e00910d42335e17369e04e933"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.688347 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a" exitCode=0 Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.688376 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.692796 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f426ea0e5ab79ee4591867b34f5081ee4879d1f2504afa9dd9e6f275
92c8fc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\"
 for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.706119 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.715598 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.723489 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.730648 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.740821 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.751592 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.759885 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.768369 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.768405 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.768414 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.768431 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.768441 4869 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.770755 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.779818 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.789912 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.800449 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.808974 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72e3e16974f6fddb4523de95bc7146e30691b9c130cbc9fa827536a79fe8947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871652decbaa741a986c3b80072047f40fd1079b6714a8c22fed0863a7295869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.818595 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.827929 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.871964 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.874764 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.874791 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.874801 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.874816 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.874825 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.879203 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72e3e16974f6fddb4523de95bc7146e30691b9c130cbc9fa827536a79fe8947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871652decbaa741a986c3b80072047f40fd1079b6714a8c22fed0863a7295869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.887027 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.893794 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.900054 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df5779361669f5732f13aaefc4675083c22827e00910d42335e17369e04e933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.909342 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f426ea0e5ab79ee4591867b34f5081ee4879d1f2504afa9dd9e6f27592c8fc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.924568 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.935655 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.944632 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.956588 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.963682 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b388272704901dd5d4d191687ea9c65ebd7e95b7613df1ecad194895cebfe7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.973653 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.978477 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.978515 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.978528 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.978563 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.978578 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:04Z","lastTransitionTime":"2026-03-12T14:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.981902 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:04 crc kubenswrapper[4869]: I0312 14:49:04.990366 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.081055 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.081125 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.081133 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.081146 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.081154 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.113756 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.113846 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.113872 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.113924 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.113897871 +0000 UTC m=+113.399123199 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.114006 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114015 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.114063 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114072 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114136 4869 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.113968 4869 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114202 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.11417419 +0000 UTC m=+113.399399468 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114232 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.114215241 +0000 UTC m=+113.399440519 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114051 4869 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114259 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.114253552 +0000 UTC m=+113.399478830 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114110 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114275 4869 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114285 4869 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.114303 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.114298183 +0000 UTC m=+113.399523461 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.183470 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.183511 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.183522 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.183551 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.183559 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.214654 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.214766 4869 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.214813 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs podName:8415254a-55e8-451e-8be1-364b98f44196 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.214800031 +0000 UTC m=+113.500025299 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs") pod "network-metrics-daemon-hllm5" (UID: "8415254a-55e8-451e-8be1-364b98f44196") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.285946 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.285977 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.285988 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.286004 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.286015 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.335649 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.335765 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.336764 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.336859 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.336908 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.336952 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.336991 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:05 crc kubenswrapper[4869]: E0312 14:49:05.337028 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.387819 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.387856 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.387865 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.387879 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.387888 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.490174 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.490215 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.490227 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.490244 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.490259 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.592819 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.592859 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.592873 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.592921 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.592934 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.694883 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.694908 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.694918 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.694930 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.694938 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.699365 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b3eaaed1cfb825734cd0457b925e1fe95db0f46a62e626735b2d36da69439b3c"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.699438 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f550d525817e97eeac4cb0660adf6a8c309e65cb5e1253480d70be992f709ed7"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.705334 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.705372 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.705400 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.705413 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} Mar 12 
14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.705424 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.705432 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.712823 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3eaaed1cfb825734cd0457b925e1fe95db0f46a62e626735b2d36da69439b3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f550d525817e97eeac4cb0660adf6a8c309e65cb5e1253480d70be992f709ed7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.715002 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"65c9dbb743a40f96dce4290625a76183fb5c201e27b534bf955a94149e339db6"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.718392 4869 generic.go:334] "Generic (PLEG): container finished" podID="9eee1745-fda9-4657-8cb0-d491ae450f82" containerID="72d714354995f1822a9a681fb2e448f3295a2229126a9c826846df21df263519" exitCode=0 Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.718458 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerDied","Data":"72d714354995f1822a9a681fb2e448f3295a2229126a9c826846df21df263519"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.728274 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f1c08b7b40fe76759026dea3ec2da5339a80ec5e68b9abc804b758e3951d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.739454 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.748932 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.755614 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9pvbj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe3a984d-84f6-4421-85dc-ecb4bdb29707\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b388272704901dd5d4d191687ea9c65ebd7e95b7613df1ecad194895cebfe7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ss8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9pvbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.767805 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.781195 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.793916 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72e3e16974f6fddb4523de95bc7146e30691b9c130cbc9fa827536a79fe8947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871652decbaa741a986c3b80072047f40fd1079b6714a8c22fed0863a7295869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.798623 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.798683 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.798693 4869 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.798710 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.798721 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.811744 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.825368 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f426ea0e5ab79ee4591867b34f5081ee4879d1f2504afa9dd9e6f27592c8fc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.842008 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.857298 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.867279 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.873774 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hllm5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8415254a-55e8-451e-8be1-364b98f44196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-824vx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hllm5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.880404 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-wkrgx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5344bfd-e537-4710-abf4-24ece04a3ff0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5df5779361669f5732f13aaefc4675083c22827e00910d42335e17369e04e933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxrpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wkrgx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.895608 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b890e426-900b-4755-ba40-37ed7df4521e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:47:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T14:48:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0312 14:48:37.218082 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 14:48:37.218186 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 14:48:37.218781 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2512027045/tls.crt::/tmp/serving-cert-2512027045/tls.key\\\\\\\"\\\\nI0312 14:48:37.402531 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 14:48:37.405394 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 14:48:37.405407 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 14:48:37.405427 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 14:48:37.405432 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 14:48:37.410728 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0312 14:48:37.410748 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410754 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 14:48:37.410759 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 14:48:37.410764 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 14:48:37.410767 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 14:48:37.410770 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0312 14:48:37.410942 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0312 14:48:37.412850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T14:48:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:47:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:47:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.900355 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.900390 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.900401 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.900415 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.900424 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:05Z","lastTransitionTime":"2026-03-12T14:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.905905 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eee1745-fda9-4657-8cb0-d491ae450f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0f477e0d6b8bd23996b9d325c163aa35d20ee6b4f9e07ed131e8f1237e925eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a62a537dca7c1d80e6470237da425bece1b502c3e01d99499ea043fafba5bfb3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f42c503dc4d674d9609273617660aaa8ae2979687759acbe283c61a58909295\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72d714354995f1822a9a681fb2e448f3295a2229126a9c826846df21df263519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72d714354995f1822a9a681fb2e448f3295a2229126a9c826846df21df263519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6qq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-thzhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.914733 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c553534-16cf-4a8f-8d01-518e9526a117\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72e3e16974f6fddb4523de95bc7146e30691b9c130cbc9fa827536a79fe8947\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871652decbaa741a986c3b80072047f40fd1079b6714a8c22fed0863a7295869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78csh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zgf6h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 
14:49:05.923473 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-l8qfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2fd2bb3f-6860-4631-a95c-c910d33724b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dhml\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-l8qfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.935066 4869 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1621c994-94d2-4105-a988-f4739518ba91\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f426ea0e5ab79ee4591867b34f5081ee4879d1f2504afa9dd9e6f27592c8fc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T14:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2bz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2lgzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.949768 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7edaf111-2689-4453-ba78-00677e1b6316\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T14:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T14:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7rnt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T14:48:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-42vwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:05 crc kubenswrapper[4869]: I0312 14:49:05.959533 4869 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T14:48:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.004279 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.004337 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.004351 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.004380 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.004393 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.018078 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wkrgx" podStartSLOduration=62.018051997 podStartE2EDuration="1m2.018051997s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:06.004187454 +0000 UTC m=+98.289412732" watchObservedRunningTime="2026-03-12 14:49:06.018051997 +0000 UTC m=+98.303277275" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.107475 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.107518 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.107528 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.107802 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.107830 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.209917 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.209955 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.209964 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.209985 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.209995 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.313231 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.313313 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.313331 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.313361 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.313380 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.416978 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.417036 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.417048 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.417107 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.417130 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.519820 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.519864 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.519876 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.519891 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.519902 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.622610 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.622653 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.622666 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.622683 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.622694 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.724848 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.724892 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.724901 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.724919 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.724932 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.726694 4869 generic.go:334] "Generic (PLEG): container finished" podID="9eee1745-fda9-4657-8cb0-d491ae450f82" containerID="658f543c9cb085045e50f8c758544c5f76e471e275fd61948f161a03bfc807dc" exitCode=0 Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.726754 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerDied","Data":"658f543c9cb085045e50f8c758544c5f76e471e275fd61948f161a03bfc807dc"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.745872 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9pvbj" podStartSLOduration=62.745837263 podStartE2EDuration="1m2.745837263s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:06.109191026 +0000 UTC m=+98.394416304" watchObservedRunningTime="2026-03-12 14:49:06.745837263 +0000 UTC m=+99.031062571" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.746120 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podStartSLOduration=62.746110242 podStartE2EDuration="1m2.746110242s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:06.74537699 +0000 UTC m=+99.030602308" watchObservedRunningTime="2026-03-12 14:49:06.746110242 +0000 UTC m=+99.031335560" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.827483 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 
14:49:06.827736 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.827761 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.827777 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.827788 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.853439 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zgf6h" podStartSLOduration=62.853418342 podStartE2EDuration="1m2.853418342s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:06.853219876 +0000 UTC m=+99.138445184" watchObservedRunningTime="2026-03-12 14:49:06.853418342 +0000 UTC m=+99.138643610" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.871875 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-l8qfx" podStartSLOduration=62.871834461 podStartE2EDuration="1m2.871834461s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
14:49:06.871142461 +0000 UTC m=+99.156367799" watchObservedRunningTime="2026-03-12 14:49:06.871834461 +0000 UTC m=+99.157059739" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.930162 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.930189 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.930197 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.930211 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:06 crc kubenswrapper[4869]: I0312 14:49:06.930220 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:06Z","lastTransitionTime":"2026-03-12T14:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.033524 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.033618 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.033639 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.033667 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.033685 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.137525 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.137570 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.137579 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.137594 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.137606 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.240200 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.240239 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.240248 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.240264 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.240274 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.336183 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:07 crc kubenswrapper[4869]: E0312 14:49:07.336324 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.336697 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:07 crc kubenswrapper[4869]: E0312 14:49:07.336761 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.336852 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:07 crc kubenswrapper[4869]: E0312 14:49:07.336963 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.336864 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:07 crc kubenswrapper[4869]: E0312 14:49:07.337089 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.343065 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.343103 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.343114 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.343130 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.343144 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.445387 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.445420 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.445430 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.445444 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.445454 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.547606 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.547646 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.547655 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.547696 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.547707 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.650210 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.650253 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.650265 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.650282 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.650294 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.734994 4869 generic.go:334] "Generic (PLEG): container finished" podID="9eee1745-fda9-4657-8cb0-d491ae450f82" containerID="4d5864a513b635f795146810717810abece50f6f76aa34f598d385a062c00274" exitCode=0 Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.735051 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerDied","Data":"4d5864a513b635f795146810717810abece50f6f76aa34f598d385a062c00274"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.739065 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.752852 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.752903 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.752917 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.752934 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.752947 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.856326 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.856400 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.856413 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.856432 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.856441 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.959192 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.959609 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.959623 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.959646 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:07 crc kubenswrapper[4869]: I0312 14:49:07.959673 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:07Z","lastTransitionTime":"2026-03-12T14:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.062372 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.062418 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.062433 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.062459 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.062473 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.191245 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.191280 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.191293 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.191308 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.191320 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.302386 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.302428 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.302442 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.302457 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.302469 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.405467 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.405564 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.405590 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.405619 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.405642 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.508868 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.508920 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.508949 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.508970 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.508985 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.610842 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.611255 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.611267 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.611287 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.611299 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.713273 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.713316 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.713334 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.713349 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.713362 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.746211 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thzhj" event={"ID":"9eee1745-fda9-4657-8cb0-d491ae450f82","Type":"ContainerStarted","Data":"fe15e0160e50b45da411f0cdc16a2a187211af5db179073e633414e5b11ef885"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.773117 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-thzhj" podStartSLOduration=64.773101274 podStartE2EDuration="1m4.773101274s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:08.773095764 +0000 UTC m=+101.058321132" watchObservedRunningTime="2026-03-12 14:49:08.773101274 +0000 UTC m=+101.058326552" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.815377 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.815449 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.815470 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.815493 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.815510 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.917583 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.917635 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.917647 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.917669 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.917682 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.969279 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.969321 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.969329 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.969345 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 14:49:08 crc kubenswrapper[4869]: I0312 14:49:08.969354 4869 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T14:49:08Z","lastTransitionTime":"2026-03-12T14:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.023913 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx"] Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.024429 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.027521 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.028997 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.029288 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.029499 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.161380 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8b7887b3-baf6-4e30-b847-772b4602ec56-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.161428 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b7887b3-baf6-4e30-b847-772b4602ec56-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.161533 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8b7887b3-baf6-4e30-b847-772b4602ec56-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.161668 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8b7887b3-baf6-4e30-b847-772b4602ec56-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.161692 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b7887b3-baf6-4e30-b847-772b4602ec56-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.262945 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8b7887b3-baf6-4e30-b847-772b4602ec56-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.263258 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b7887b3-baf6-4e30-b847-772b4602ec56-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.263341 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b7887b3-baf6-4e30-b847-772b4602ec56-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.263149 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8b7887b3-baf6-4e30-b847-772b4602ec56-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.263507 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8b7887b3-baf6-4e30-b847-772b4602ec56-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.263624 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b7887b3-baf6-4e30-b847-772b4602ec56-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.263625 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/8b7887b3-baf6-4e30-b847-772b4602ec56-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.265711 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b7887b3-baf6-4e30-b847-772b4602ec56-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.269296 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b7887b3-baf6-4e30-b847-772b4602ec56-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.281898 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b7887b3-baf6-4e30-b847-772b4602ec56-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p5fdx\" (UID: \"8b7887b3-baf6-4e30-b847-772b4602ec56\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.335440 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:09 crc kubenswrapper[4869]: E0312 14:49:09.335788 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.335488 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:09 crc kubenswrapper[4869]: E0312 14:49:09.336029 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.335444 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:09 crc kubenswrapper[4869]: E0312 14:49:09.336187 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.335516 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:09 crc kubenswrapper[4869]: E0312 14:49:09.336395 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.339579 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.353644 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.362529 4869 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.750673 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" event={"ID":"8b7887b3-baf6-4e30-b847-772b4602ec56","Type":"ContainerStarted","Data":"74d0912785adedce16984072dd05557d0df416e61e8621fb0ad580cd779849f6"} Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.750715 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" 
event={"ID":"8b7887b3-baf6-4e30-b847-772b4602ec56","Type":"ContainerStarted","Data":"2eea875b017720543f86484b4c5c2b47ad05afd58ab42aa0192dcebc6feee439"} Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.762043 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerStarted","Data":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.763437 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.763475 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.763487 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.794609 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.795167 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.801398 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p5fdx" podStartSLOduration=65.801379913 podStartE2EDuration="1m5.801379913s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:09.767328247 +0000 UTC m=+102.052553565" watchObservedRunningTime="2026-03-12 14:49:09.801379913 +0000 UTC m=+102.086605181" Mar 12 
14:49:09 crc kubenswrapper[4869]: I0312 14:49:09.827357 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podStartSLOduration=65.827332337 podStartE2EDuration="1m5.827332337s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:09.802121855 +0000 UTC m=+102.087347143" watchObservedRunningTime="2026-03-12 14:49:09.827332337 +0000 UTC m=+102.112557615" Mar 12 14:49:11 crc kubenswrapper[4869]: I0312 14:49:11.335828 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:11 crc kubenswrapper[4869]: I0312 14:49:11.335898 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:11 crc kubenswrapper[4869]: E0312 14:49:11.336381 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:49:11 crc kubenswrapper[4869]: I0312 14:49:11.335968 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:11 crc kubenswrapper[4869]: E0312 14:49:11.336454 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:49:11 crc kubenswrapper[4869]: I0312 14:49:11.335920 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:11 crc kubenswrapper[4869]: E0312 14:49:11.336564 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:49:11 crc kubenswrapper[4869]: E0312 14:49:11.336817 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:11 crc kubenswrapper[4869]: I0312 14:49:11.931764 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hllm5"] Mar 12 14:49:11 crc kubenswrapper[4869]: I0312 14:49:11.931884 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:11 crc kubenswrapper[4869]: E0312 14:49:11.932047 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:12 crc kubenswrapper[4869]: I0312 14:49:12.337292 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:49:12 crc kubenswrapper[4869]: E0312 14:49:12.337475 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 14:49:12 crc kubenswrapper[4869]: I0312 14:49:12.350900 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 12 14:49:13 crc kubenswrapper[4869]: I0312 14:49:13.335951 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:13 crc kubenswrapper[4869]: I0312 14:49:13.335987 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:13 crc kubenswrapper[4869]: E0312 14:49:13.336109 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hllm5" podUID="8415254a-55e8-451e-8be1-364b98f44196" Mar 12 14:49:13 crc kubenswrapper[4869]: I0312 14:49:13.336137 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:13 crc kubenswrapper[4869]: I0312 14:49:13.335951 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:13 crc kubenswrapper[4869]: E0312 14:49:13.336312 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 14:49:13 crc kubenswrapper[4869]: E0312 14:49:13.336396 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 14:49:13 crc kubenswrapper[4869]: E0312 14:49:13.336472 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.466617 4869 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.467246 4869 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.514789 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.521966 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.522336 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.523873 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vnnq4"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.524095 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.526124 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.526582 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.526864 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.527091 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.527802 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.528286 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2drd7"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.528718 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.529351 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.530085 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.540698 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fs9qm"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.541401 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jz4tc"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.542078 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.542804 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.547860 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fdwk9"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.548571 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.601801 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.602004 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.602238 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.602625 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.602941 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.603205 4869 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.603240 4869 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.603437 4869 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.603460 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.603527 4869 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.603559 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.603936 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.603975 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.604029 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.604043 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.604175 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.604325 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.604475 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.604620 4869 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.604640 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.604708 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.604829 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.604853 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the 
namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.604879 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.604945 4869 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.604962 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.604979 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605088 4869 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace 
"openshift-controller-manager": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605105 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605150 4869 reflector.go:561] object-"openshift-authentication-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605166 4869 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605169 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: 
E0312 14:49:14.605180 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.605228 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605331 4869 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605345 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.605356 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.605399 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 14:49:14 crc kubenswrapper[4869]: 
W0312 14:49:14.605497 4869 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605508 4869 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605512 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605526 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605602 4869 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API 
group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605618 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605668 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605681 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.605702 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605768 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no 
relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605784 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.605822 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.605847 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.605863 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.605887 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.605945 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 
12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.606021 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.606431 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.606445 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.606922 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.606946 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.607003 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.607646 4869 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace 
"openshift-machine-api": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.607674 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.613924 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.613976 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.614014 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.614076 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 14:49:14 crc kubenswrapper[4869]: W0312 14:49:14.613936 4869 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets 
"oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 12 14:49:14 crc kubenswrapper[4869]: E0312 14:49:14.614149 4869 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.622926 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-config\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.622969 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.622993 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-etcd-client\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: 
\"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623017 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623039 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/993b8ae9-69a2-4cb2-806a-888528215561-audit-dir\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623061 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623082 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-config\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623103 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7251ddc-bcae-4fa6-9396-2bf23a1e030d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-65jxf\" (UID: \"b7251ddc-bcae-4fa6-9396-2bf23a1e030d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623121 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623142 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbaf341-4b19-41a8-8120-b09206d52dc1-config\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623162 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqk5t\" (UniqueName: \"kubernetes.io/projected/90ae9f42-560b-4b79-a947-25c6de331025-kube-api-access-tqk5t\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623185 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623207 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623227 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623248 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-service-ca-bundle\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623271 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9qv\" (UniqueName: \"kubernetes.io/projected/47d3c91d-a968-413d-a941-bc67279bf905-kube-api-access-gf9qv\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623291 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5br8w\" (UniqueName: \"kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623315 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-client-ca\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623336 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623356 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hj26\" (UniqueName: \"kubernetes.io/projected/fcbaf341-4b19-41a8-8120-b09206d52dc1-kube-api-access-8hj26\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623382 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fcbaf341-4b19-41a8-8120-b09206d52dc1-auth-proxy-config\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623402 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2kml\" (UniqueName: \"kubernetes.io/projected/d17df7e2-d85d-4172-aff7-0b5e63605a77-kube-api-access-r2kml\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623458 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623481 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57f786f4-7046-4810-ab69-d692be82903f-serving-cert\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623504 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/d640d651-bd78-49dc-b945-f0c749666c66-audit-dir\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623524 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623564 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3c91d-a968-413d-a941-bc67279bf905-serving-cert\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623588 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgpkw\" (UniqueName: \"kubernetes.io/projected/b6c30462-fb75-4888-a29b-bd63b39159ee-kube-api-access-wgpkw\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623609 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca821a8-9fb1-4ba9-8955-0969735ee00a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623629 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623652 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp5mh\" (UniqueName: \"kubernetes.io/projected/3ca821a8-9fb1-4ba9-8955-0969735ee00a-kube-api-access-tp5mh\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623673 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f786f4-7046-4810-ab69-d692be82903f-config\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623695 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d17df7e2-d85d-4172-aff7-0b5e63605a77-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.623987 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c30462-fb75-4888-a29b-bd63b39159ee-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624031 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624062 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624086 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624465 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624700 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624848 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624990 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.625138 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.625276 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.624123 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tgsn\" (UniqueName: \"kubernetes.io/projected/57f786f4-7046-4810-ab69-d692be82903f-kube-api-access-9tgsn\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.625417 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.625516 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.625645 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.625745 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.625969 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626000 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626144 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626160 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626305 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626403 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626590 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626686 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626940 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.628341 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-qfqjj"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.628900 4869 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.629303 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630295 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.626824 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-audit-policies\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630465 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630513 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fcbaf341-4b19-41a8-8120-b09206d52dc1-machine-approver-tls\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630533 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630588 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f786f4-7046-4810-ab69-d692be82903f-trusted-ca\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630609 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpkzh\" (UniqueName: \"kubernetes.io/projected/993b8ae9-69a2-4cb2-806a-888528215561-kube-api-access-hpkzh\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630643 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630668 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-encryption-config\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630699 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-config\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630721 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d17df7e2-d85d-4172-aff7-0b5e63605a77-serving-cert\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630744 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6c30462-fb75-4888-a29b-bd63b39159ee-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630767 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4c5h\" (UniqueName: \"kubernetes.io/projected/d640d651-bd78-49dc-b945-f0c749666c66-kube-api-access-x4c5h\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630787 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-audit-policies\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630813 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630832 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-images\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630852 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqdw4\" (UniqueName: \"kubernetes.io/projected/b7251ddc-bcae-4fa6-9396-2bf23a1e030d-kube-api-access-sqdw4\") pod \"cluster-samples-operator-665b6dd947-65jxf\" (UID: \"b7251ddc-bcae-4fa6-9396-2bf23a1e030d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630873 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.630892 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.632174 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.632821 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.633470 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.633979 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.635019 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6r22c"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.635690 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6xr6k"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.635741 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.636346 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-6xr6k" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.671659 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2hvvd"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.673685 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.674059 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.685757 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.685955 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.686505 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.687228 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.687602 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.687657 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4dlk8"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.688127 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.689800 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.691633 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.692599 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.697950 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.698367 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.698464 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.698687 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.699239 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.699403 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.699572 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.699837 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.700122 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.700847 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.707676 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.708138 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5lmx8"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.708393 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.708526 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.708706 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.708949 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.709074 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.709427 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.709568 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.713125 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.716797 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fs9qm"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.717682 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.717812 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.717986 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718083 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718152 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718226 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718300 4869 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718366 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718514 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718612 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.717778 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718814 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718168 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.718198 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.720220 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.720387 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.720733 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5f46k"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.720925 4869 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.721032 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.721351 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.727389 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.728147 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.729038 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731761 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3c91d-a968-413d-a941-bc67279bf905-serving-cert\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731800 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgpkw\" (UniqueName: \"kubernetes.io/projected/b6c30462-fb75-4888-a29b-bd63b39159ee-kube-api-access-wgpkw\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc 
kubenswrapper[4869]: I0312 14:49:14.731824 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731847 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca821a8-9fb1-4ba9-8955-0969735ee00a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731870 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp5mh\" (UniqueName: \"kubernetes.io/projected/3ca821a8-9fb1-4ba9-8955-0969735ee00a-kube-api-access-tp5mh\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f786f4-7046-4810-ab69-d692be82903f-config\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731936 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d17df7e2-d85d-4172-aff7-0b5e63605a77-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731969 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c30462-fb75-4888-a29b-bd63b39159ee-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.731992 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732019 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732046 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732079 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-audit-policies\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732107 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732155 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tgsn\" (UniqueName: \"kubernetes.io/projected/57f786f4-7046-4810-ab69-d692be82903f-kube-api-access-9tgsn\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732180 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732209 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fcbaf341-4b19-41a8-8120-b09206d52dc1-machine-approver-tls\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc 
kubenswrapper[4869]: I0312 14:49:14.732232 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f786f4-7046-4810-ab69-d692be82903f-trusted-ca\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkzh\" (UniqueName: \"kubernetes.io/projected/993b8ae9-69a2-4cb2-806a-888528215561-kube-api-access-hpkzh\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732283 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732308 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-encryption-config\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732337 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-config\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732361 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d17df7e2-d85d-4172-aff7-0b5e63605a77-serving-cert\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732384 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6c30462-fb75-4888-a29b-bd63b39159ee-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732410 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4c5h\" (UniqueName: \"kubernetes.io/projected/d640d651-bd78-49dc-b945-f0c749666c66-kube-api-access-x4c5h\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732434 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-audit-policies\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732457 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.732487 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqdw4\" (UniqueName: \"kubernetes.io/projected/b7251ddc-bcae-4fa6-9396-2bf23a1e030d-kube-api-access-sqdw4\") pod \"cluster-samples-operator-665b6dd947-65jxf\" (UID: \"b7251ddc-bcae-4fa6-9396-2bf23a1e030d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.733829 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f786f4-7046-4810-ab69-d692be82903f-trusted-ca\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.734177 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f786f4-7046-4810-ab69-d692be82903f-config\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.734971 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d17df7e2-d85d-4172-aff7-0b5e63605a77-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.735352 
4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c30462-fb75-4888-a29b-bd63b39159ee-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.738628 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.739202 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.739221 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.741844 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742782 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742825 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742847 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-images\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742886 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-config\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742909 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742929 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-etcd-client\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/993b8ae9-69a2-4cb2-806a-888528215561-audit-dir\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.742996 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/993b8ae9-69a2-4cb2-806a-888528215561-audit-dir\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.743104 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.743184 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-audit-policies\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.743240 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.743188 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.743397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-config\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.743410 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.743554 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744172 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbaf341-4b19-41a8-8120-b09206d52dc1-config\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744468 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-config\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744529 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqk5t\" (UniqueName: \"kubernetes.io/projected/90ae9f42-560b-4b79-a947-25c6de331025-kube-api-access-tqk5t\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744569 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b7251ddc-bcae-4fa6-9396-2bf23a1e030d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-65jxf\" (UID: \"b7251ddc-bcae-4fa6-9396-2bf23a1e030d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744591 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744650 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-config\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744659 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744701 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744723 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-service-ca-bundle\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744747 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf9qv\" (UniqueName: \"kubernetes.io/projected/47d3c91d-a968-413d-a941-bc67279bf905-kube-api-access-gf9qv\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744770 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5br8w\" (UniqueName: \"kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.744936 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745010 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745021 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745062 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hj26\" (UniqueName: \"kubernetes.io/projected/fcbaf341-4b19-41a8-8120-b09206d52dc1-kube-api-access-8hj26\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745619 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-client-ca\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745752 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fcbaf341-4b19-41a8-8120-b09206d52dc1-auth-proxy-config\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745869 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2kml\" (UniqueName: \"kubernetes.io/projected/d17df7e2-d85d-4172-aff7-0b5e63605a77-kube-api-access-r2kml\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745953 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57f786f4-7046-4810-ab69-d692be82903f-serving-cert\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.746027 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbaf341-4b19-41a8-8120-b09206d52dc1-config\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.746028 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d640d651-bd78-49dc-b945-f0c749666c66-audit-dir\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.746084 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.746113 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745648 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d17df7e2-d85d-4172-aff7-0b5e63605a77-serving-cert\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.746085 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d640d651-bd78-49dc-b945-f0c749666c66-audit-dir\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745801 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 
14:49:14.746486 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47d3c91d-a968-413d-a941-bc67279bf905-service-ca-bundle\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.746846 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fcbaf341-4b19-41a8-8120-b09206d52dc1-auth-proxy-config\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.745991 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.747233 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca821a8-9fb1-4ba9-8955-0969735ee00a-serving-cert\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.746110 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: 
\"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.747447 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-client-ca\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.747783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.748207 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47d3c91d-a968-413d-a941-bc67279bf905-serving-cert\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.749619 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.750238 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57f786f4-7046-4810-ab69-d692be82903f-serving-cert\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 
12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.750588 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.750783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.751006 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6c30462-fb75-4888-a29b-bd63b39159ee-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.751193 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7251ddc-bcae-4fa6-9396-2bf23a1e030d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-65jxf\" (UID: \"b7251ddc-bcae-4fa6-9396-2bf23a1e030d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.751198 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.752121 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.752505 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.753388 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.768652 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.753678 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.771827 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vnnq4"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.772172 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.774639 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.775414 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.775995 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.776119 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.776188 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-56f66"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.776804 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.777648 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fcbaf341-4b19-41a8-8120-b09206d52dc1-machine-approver-tls\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.778453 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.781408 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.784059 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h2p9q"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.784765 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.786702 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qdm27"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.787111 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.790229 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.790758 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.791143 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.791282 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.792038 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.794648 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.796001 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.797333 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.797892 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.799861 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nsnw9"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.800683 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.800714 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.801147 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.802025 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.802791 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6r22c"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.802879 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.803354 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.803958 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.804164 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4dlk8"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.805100 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qfqjj"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.805733 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-njkp6"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.806590 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.806911 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.808017 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.810069 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jz4tc"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.810314 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fdwk9"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.811502 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.814696 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6xr6k"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.815819 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.818355 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.819962 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.821017 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-5lmx8"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.822470 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h2p9q"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.823513 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2hvvd"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.824312 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.825454 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2drd7"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.826450 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.827387 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dq7n7"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.829431 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.830652 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.834014 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.835504 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.836848 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.848656 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.848938 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.849752 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.849955 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.853258 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-njkp6"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.857505 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-56f66"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.857775 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.857906 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.861203 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nsnw9"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.865183 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qdm27"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.866910 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.868052 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zmqf"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.868915 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.869252 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-624kd"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.870099 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-624kd" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.870149 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-624kd"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.877976 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bsjt9"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.878501 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.889032 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.890634 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bsjt9"] Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.904410 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.924252 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.945457 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 14:49:14 crc kubenswrapper[4869]: I0312 14:49:14.964675 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.004885 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.024620 4869 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.044620 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.066324 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.085123 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.109833 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.124437 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.145224 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.164592 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.184688 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.204110 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.224159 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.244855 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.265111 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.284815 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.304039 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.335745 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.335777 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.335899 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.336270 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.343941 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp5mh\" (UniqueName: \"kubernetes.io/projected/3ca821a8-9fb1-4ba9-8955-0969735ee00a-kube-api-access-tp5mh\") pod \"route-controller-manager-6576b87f9c-dv4p4\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.357918 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqdw4\" (UniqueName: \"kubernetes.io/projected/b7251ddc-bcae-4fa6-9396-2bf23a1e030d-kube-api-access-sqdw4\") pod \"cluster-samples-operator-665b6dd947-65jxf\" (UID: \"b7251ddc-bcae-4fa6-9396-2bf23a1e030d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.377631 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgpkw\" (UniqueName: \"kubernetes.io/projected/b6c30462-fb75-4888-a29b-bd63b39159ee-kube-api-access-wgpkw\") pod \"openshift-apiserver-operator-796bbdcf4f-5hczk\" (UID: \"b6c30462-fb75-4888-a29b-bd63b39159ee\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.397408 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4c5h\" (UniqueName: \"kubernetes.io/projected/d640d651-bd78-49dc-b945-f0c749666c66-kube-api-access-x4c5h\") pod \"oauth-openshift-558db77b4-fdwk9\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.418111 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tgsn\" 
(UniqueName: \"kubernetes.io/projected/57f786f4-7046-4810-ab69-d692be82903f-kube-api-access-9tgsn\") pod \"console-operator-58897d9998-vnnq4\" (UID: \"57f786f4-7046-4810-ab69-d692be82903f\") " pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.424908 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.450597 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.458682 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.464610 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.471360 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.481030 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.485097 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.505421 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.524578 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.546282 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.564857 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.585623 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.592898 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.605132 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.653754 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf"] Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.655227 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4"] Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.682909 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vnnq4"] Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.701048 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk"] Mar 12 14:49:15 crc kubenswrapper[4869]: W0312 14:49:15.704149 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f786f4_7046_4810_ab69_d692be82903f.slice/crio-d7a2d5bf5850a91e33b0685102c3e91d135a256b49bedfe32a7afe39ae3b70e5 WatchSource:0}: Error finding container d7a2d5bf5850a91e33b0685102c3e91d135a256b49bedfe32a7afe39ae3b70e5: Status 404 returned error can't find the container with id d7a2d5bf5850a91e33b0685102c3e91d135a256b49bedfe32a7afe39ae3b70e5 Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.707768 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hj26\" (UniqueName: \"kubernetes.io/projected/fcbaf341-4b19-41a8-8120-b09206d52dc1-kube-api-access-8hj26\") pod \"machine-approver-56656f9798-tcc6m\" (UID: \"fcbaf341-4b19-41a8-8120-b09206d52dc1\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.718173 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2kml\" (UniqueName: \"kubernetes.io/projected/d17df7e2-d85d-4172-aff7-0b5e63605a77-kube-api-access-r2kml\") pod \"openshift-config-operator-7777fb866f-z6x5x\" (UID: \"d17df7e2-d85d-4172-aff7-0b5e63605a77\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.734238 4869 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.734297 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config podName:ca5302b5-90b9-412e-b378-a1fdedf81184 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.234282016 +0000 UTC m=+108.519507294 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config") pod "controller-manager-879f6c89f-jz4tc" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.734980 4869 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.735017 4869 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.735071 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-audit-policies podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.235052199 +0000 UTC m=+108.520277477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-audit-policies") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.735317 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-etcd-serving-ca podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.235095291 +0000 UTC m=+108.520320669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-etcd-serving-ca") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.735326 4869 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.735497 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca podName:ca5302b5-90b9-412e-b378-a1fdedf81184 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.235488112 +0000 UTC m=+108.520713390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca") pod "controller-manager-879f6c89f-jz4tc" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.735580 4869 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.735634 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles podName:ca5302b5-90b9-412e-b378-a1fdedf81184 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.235607356 +0000 UTC m=+108.520832634 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles") pod "controller-manager-879f6c89f-jz4tc" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.736570 4869 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.736603 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-encryption-config podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.236594625 +0000 UTC m=+108.521819903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-encryption-config") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.736622 4869 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.736698 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-config podName:90ae9f42-560b-4b79-a947-25c6de331025 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.236671018 +0000 UTC m=+108.521896296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-config") pod "machine-api-operator-5694c8668f-2drd7" (UID: "90ae9f42-560b-4b79-a947-25c6de331025") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743631 4869 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743714 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.243694417 +0000 UTC m=+108.528919695 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743744 4869 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743778 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert podName:ca5302b5-90b9-412e-b378-a1fdedf81184 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.243765419 +0000 UTC m=+108.528990767 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert") pod "controller-manager-879f6c89f-jz4tc" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743804 4869 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743833 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.243825991 +0000 UTC m=+108.529051359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743856 4869 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743872 4869 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743881 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-etcd-client podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.243873172 +0000 UTC m=+108.529098550 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-etcd-client") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.743932 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-images podName:90ae9f42-560b-4b79-a947-25c6de331025 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.243920364 +0000 UTC m=+108.529145722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-images") pod "machine-api-operator-5694c8668f-2drd7" (UID: "90ae9f42-560b-4b79-a947-25c6de331025") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.743992 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.745687 4869 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: E0312 14:49:15.745731 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls podName:90ae9f42-560b-4b79-a947-25c6de331025 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:16.245720427 +0000 UTC m=+108.530945785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-2drd7" (UID: "90ae9f42-560b-4b79-a947-25c6de331025") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.747416 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.7474043679999998 podStartE2EDuration="3.747404368s" podCreationTimestamp="2026-03-12 14:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:15.746165951 +0000 UTC m=+108.031391239" watchObservedRunningTime="2026-03-12 14:49:15.747404368 +0000 UTC m=+108.032629646" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.763451 4869 request.go:700] Waited for 1.011015703s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.764390 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.781699 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vnnq4" event={"ID":"57f786f4-7046-4810-ab69-d692be82903f","Type":"ContainerStarted","Data":"d7a2d5bf5850a91e33b0685102c3e91d135a256b49bedfe32a7afe39ae3b70e5"} Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.783281 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" 
event={"ID":"b7251ddc-bcae-4fa6-9396-2bf23a1e030d","Type":"ContainerStarted","Data":"880e0a0a24f1e6a45ae49d11678cbb32976169ab82efec884d5c25453578ab63"} Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.784562 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.784879 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" event={"ID":"3ca821a8-9fb1-4ba9-8955-0969735ee00a","Type":"ContainerStarted","Data":"2904cab08b0207415816fc7088f59777b0dba6a18061e1a51ead0d965d990440"} Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.785954 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" event={"ID":"b6c30462-fb75-4888-a29b-bd63b39159ee","Type":"ContainerStarted","Data":"eb3c50ac916999439f93d6c193f411be64db9a147433e94283353bebafc975a0"} Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.792205 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.804386 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.825586 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.845822 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.864888 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.884957 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.902815 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.904064 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.925092 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.946178 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.964675 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 14:49:15 crc kubenswrapper[4869]: I0312 14:49:15.985469 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.006163 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.007758 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fdwk9"] Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.024850 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.045118 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.071237 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.085203 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.093437 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x"] Mar 12 14:49:16 crc kubenswrapper[4869]: W0312 14:49:16.102134 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd17df7e2_d85d_4172_aff7_0b5e63605a77.slice/crio-9dea55e5e8669759153557b6fb4db733c0eac9c6b2134b64eaafac6bac947d1b WatchSource:0}: Error finding container 9dea55e5e8669759153557b6fb4db733c0eac9c6b2134b64eaafac6bac947d1b: Status 404 returned error can't find the container with id 9dea55e5e8669759153557b6fb4db733c0eac9c6b2134b64eaafac6bac947d1b Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.105031 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.125294 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.145658 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.164157 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.185579 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 
14:49:16.207031 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.226873 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.244885 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264154 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264174 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264202 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264231 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-encryption-config\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-config\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264283 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-audit-policies\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264329 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264346 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-images\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264361 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264388 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-etcd-client\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264411 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.264427 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.266191 4869 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.285294 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.306096 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.326222 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.345077 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.364169 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.384772 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.404731 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.425031 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.445083 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.461151 4869 projected.go:288] Couldn't get configMap openshift-oauth-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc 
kubenswrapper[4869]: I0312 14:49:16.464836 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.485289 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.504796 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.524283 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.543855 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.565116 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.584436 4869 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.606078 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.624394 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.637100 4869 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.644961 4869 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.656746 4869 projected.go:288] Couldn't get configMap openshift-controller-manager/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.656782 4869 projected.go:194] Error preparing data for projected volume kube-api-access-5br8w for pod openshift-controller-manager/controller-manager-879f6c89f-jz4tc: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.656882 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w podName:ca5302b5-90b9-412e-b378-a1fdedf81184 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.156833471 +0000 UTC m=+109.442058749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5br8w" (UniqueName: "kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w") pod "controller-manager-879f6c89f-jz4tc" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.665133 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.675420 4869 projected.go:288] Couldn't get configMap openshift-authentication-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.675455 4869 projected.go:194] Error preparing data for projected volume kube-api-access-gf9qv for pod openshift-authentication-operator/authentication-operator-69f744f599-fs9qm: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.675519 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/47d3c91d-a968-413d-a941-bc67279bf905-kube-api-access-gf9qv podName:47d3c91d-a968-413d-a941-bc67279bf905 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.175500928 +0000 UTC m=+109.460726206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gf9qv" (UniqueName: "kubernetes.io/projected/47d3c91d-a968-413d-a941-bc67279bf905-kube-api-access-gf9qv") pod "authentication-operator-69f744f599-fs9qm" (UID: "47d3c91d-a968-413d-a941-bc67279bf905") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.684024 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.705163 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.724739 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.744246 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.763724 4869 request.go:700] Waited for 1.88502862s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.766144 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.784277 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.790360 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" 
event={"ID":"fcbaf341-4b19-41a8-8120-b09206d52dc1","Type":"ContainerStarted","Data":"b80d32f43c6be9c79cdfba4490d2eae06fd73abccf488b6f65f97b07625028ef"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.790418 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" event={"ID":"fcbaf341-4b19-41a8-8120-b09206d52dc1","Type":"ContainerStarted","Data":"8360535b589fed0e1613bb07b105c77addd7925969301bfabda18eb618077132"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.790433 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" event={"ID":"fcbaf341-4b19-41a8-8120-b09206d52dc1","Type":"ContainerStarted","Data":"1b2d7786c79bd1799f864568a8d7f47fe26aa1168436ed48fcd234c2971e9efa"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.791317 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" event={"ID":"3ca821a8-9fb1-4ba9-8955-0969735ee00a","Type":"ContainerStarted","Data":"df4e08e72f43fd35b430a46743215055ead330b913d6de1df46d0281e82f96b4"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.791579 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.792658 4869 generic.go:334] "Generic (PLEG): container finished" podID="d17df7e2-d85d-4172-aff7-0b5e63605a77" containerID="547ba9d92bec586ff858882d472d533ba31f5f850e5e711d56cc81fa12bc9d64" exitCode=0 Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.792742 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" 
event={"ID":"d17df7e2-d85d-4172-aff7-0b5e63605a77","Type":"ContainerDied","Data":"547ba9d92bec586ff858882d472d533ba31f5f850e5e711d56cc81fa12bc9d64"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.792774 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" event={"ID":"d17df7e2-d85d-4172-aff7-0b5e63605a77","Type":"ContainerStarted","Data":"9dea55e5e8669759153557b6fb4db733c0eac9c6b2134b64eaafac6bac947d1b"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.793988 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" event={"ID":"d640d651-bd78-49dc-b945-f0c749666c66","Type":"ContainerStarted","Data":"ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.794024 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" event={"ID":"d640d651-bd78-49dc-b945-f0c749666c66","Type":"ContainerStarted","Data":"17d4e6358251c9e9f4695fc7bf5401464ccc338c55529f48cc85e98f8d00e189"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.794190 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.794798 4869 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fdwk9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.794832 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" podUID="d640d651-bd78-49dc-b945-f0c749666c66" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.797088 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" event={"ID":"b6c30462-fb75-4888-a29b-bd63b39159ee","Type":"ContainerStarted","Data":"021a2f0d97e86e839da2984508d6485ea083e960b52de3bf233f6219e9bf46b6"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.804461 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.805221 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vnnq4" event={"ID":"57f786f4-7046-4810-ab69-d692be82903f","Type":"ContainerStarted","Data":"898bcdbd4e83a48ac34535c8a190cc18e72df47fec2093e890f8c2df7db7e99e"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.805382 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.808433 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" event={"ID":"b7251ddc-bcae-4fa6-9396-2bf23a1e030d","Type":"ContainerStarted","Data":"3f42447b5142a12b53c348bb224953b9d51c23da4213848c8025d157a5ad338f"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.808487 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.808504 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" 
event={"ID":"b7251ddc-bcae-4fa6-9396-2bf23a1e030d","Type":"ContainerStarted","Data":"72d66f152ffdd0070462ef3fb57b8f7bd9c88c0abf20bc3e09560b9b65f51ef9"} Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.829097 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.864599 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877115 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877166 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-console-config\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877554 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b89c3c57-363d-472f-a7cf-32b0e1542f60-metrics-tls\") pod \"dns-operator-744455d44c-6r22c\" (UID: \"b89c3c57-363d-472f-a7cf-32b0e1542f60\") " pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877591 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjtt\" 
(UniqueName: \"kubernetes.io/projected/da89f990-01f3-4fc4-bc66-6f33c9639082-kube-api-access-jtjtt\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877617 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-registry-certificates\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877678 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b996bdda-9e5f-403c-9ab3-a1a371388e08-node-pullsecrets\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877699 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-serving-cert\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877747 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhv8f\" (UniqueName: \"kubernetes.io/projected/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-kube-api-access-xhv8f\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877781 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-trusted-ca-bundle\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877803 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-etcd-serving-ca\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877825 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-encryption-config\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877915 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da89f990-01f3-4fc4-bc66-6f33c9639082-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.877968 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da89f990-01f3-4fc4-bc66-6f33c9639082-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.878004 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b996bdda-9e5f-403c-9ab3-a1a371388e08-audit-dir\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.878058 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-ca\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.878137 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6663b93f-3968-4397-94f9-57c42e90fad8-serving-cert\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.878377 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-client\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.878471 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-config\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.878723 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.879513 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.379496952 +0000 UTC m=+109.664722230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.880396 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7g5n\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-kube-api-access-n7g5n\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.881096 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32af1403-874a-49e0-ab8f-96511da15218-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.881168 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xssq2\" (UniqueName: \"kubernetes.io/projected/b89c3c57-363d-472f-a7cf-32b0e1542f60-kube-api-access-xssq2\") pod \"dns-operator-744455d44c-6r22c\" (UID: \"b89c3c57-363d-472f-a7cf-32b0e1542f60\") " pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.881623 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-audit\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.882707 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48bv\" (UniqueName: \"kubernetes.io/projected/e61f813a-db17-46a6-a380-9f13452ef07b-kube-api-access-m48bv\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.883336 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-trusted-ca\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.883440 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32af1403-874a-49e0-ab8f-96511da15218-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.883471 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-config\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.883622 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-image-import-ca\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.883754 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-oauth-config\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884163 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884205 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-service-ca\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884232 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884258 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-bound-sa-token\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884289 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884315 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-service-ca\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884398 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkn4\" (UniqueName: \"kubernetes.io/projected/7d89d513-f587-4072-9038-09aa4e0a6b0d-kube-api-access-cgkn4\") pod \"downloads-7954f5f757-6xr6k\" (UID: \"7d89d513-f587-4072-9038-09aa4e0a6b0d\") " pod="openshift-console/downloads-7954f5f757-6xr6k" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884433 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bp45t\" (UniqueName: \"kubernetes.io/projected/b996bdda-9e5f-403c-9ab3-a1a371388e08-kube-api-access-bp45t\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884486 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-etcd-client\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884531 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-registry-tls\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884567 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-serving-cert\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884593 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-oauth-serving-cert\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.884695 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22wc2\" (UniqueName: \"kubernetes.io/projected/6663b93f-3968-4397-94f9-57c42e90fad8-kube-api-access-22wc2\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.891220 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.905862 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.925368 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.946674 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.965035 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.984600 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985366 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985643 4869 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-m48bv\" (UniqueName: \"kubernetes.io/projected/e61f813a-db17-46a6-a380-9f13452ef07b-kube-api-access-m48bv\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985699 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-mountpoint-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.985741 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.48571564 +0000 UTC m=+109.770940918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985798 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-trusted-ca\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985830 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea68feb-ae95-41ab-9d2f-43f111c3721c-profile-collector-cert\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985850 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tgj\" (UniqueName: \"kubernetes.io/projected/a33a8a30-b9bd-4d21-a5f5-aca28766b920-kube-api-access-l6tgj\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985875 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/32af1403-874a-49e0-ab8f-96511da15218-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985896 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-config\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985913 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxkc7\" (UniqueName: \"kubernetes.io/projected/e5060045-fb36-499c-ab6e-d09eee39afd8-kube-api-access-gxkc7\") pod \"migrator-59844c95c7-c55nj\" (UID: \"e5060045-fb36-499c-ab6e-d09eee39afd8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985928 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea68feb-ae95-41ab-9d2f-43f111c3721c-srv-cert\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985945 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7tzr\" (UniqueName: \"kubernetes.io/projected/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-kube-api-access-c7tzr\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985968 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h74tj\" (UniqueName: \"kubernetes.io/projected/f0c481b0-7a0f-4737-8660-fcb14728964b-kube-api-access-h74tj\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.985992 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-oauth-config\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986009 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-certs\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986051 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986068 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986085 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a0f5f3-30b3-45bf-9545-877b619b94f5-config-volume\") pod \"dns-default-624kd\" (UID: \"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986107 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986127 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15598969-d30d-4c3d-a3ca-e9a75c54fb90-signing-key\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986157 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-service-ca\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986198 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986216 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-tmpfs\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986235 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c84fv\" (UniqueName: \"kubernetes.io/projected/22b1e4b6-8e9c-4e12-8627-469e056beee5-kube-api-access-c84fv\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986255 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqj8g\" (UniqueName: \"kubernetes.io/projected/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-kube-api-access-zqj8g\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986274 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-etcd-client\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986296 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986322 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2c29741-810a-48bb-a987-127fe5d45625-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986355 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46d3c5a6-c886-4ae0-b381-95ffb9902718-ready\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986372 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b1e4b6-8e9c-4e12-8627-469e056beee5-config-volume\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986387 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0a0f5f3-30b3-45bf-9545-877b619b94f5-metrics-tls\") pod \"dns-default-624kd\" (UID: 
\"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986421 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22wc2\" (UniqueName: \"kubernetes.io/projected/6663b93f-3968-4397-94f9-57c42e90fad8-kube-api-access-22wc2\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986438 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0c481b0-7a0f-4737-8660-fcb14728964b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986461 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-console-config\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986482 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986499 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fpl\" (UniqueName: 
\"kubernetes.io/projected/46d3c5a6-c886-4ae0-b381-95ffb9902718-kube-api-access-v7fpl\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986518 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-stats-auth\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986553 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b89c3c57-363d-472f-a7cf-32b0e1542f60-metrics-tls\") pod \"dns-operator-744455d44c-6r22c\" (UID: \"b89c3c57-363d-472f-a7cf-32b0e1542f60\") " pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986570 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxz29\" (UniqueName: \"kubernetes.io/projected/4ea68feb-ae95-41ab-9d2f-43f111c3721c-kube-api-access-bxz29\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986588 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e279336b-ac69-4574-9910-11c1fe663252-service-ca-bundle\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986607 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-registry-certificates\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986628 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-serving-cert\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986651 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjtt\" (UniqueName: \"kubernetes.io/projected/da89f990-01f3-4fc4-bc66-6f33c9639082-kube-api-access-jtjtt\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986679 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-trusted-ca-bundle\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986702 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-etcd-serving-ca\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986730 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxng\" (UniqueName: \"kubernetes.io/projected/a46b2dfd-f10c-4963-bf75-2068a886b420-kube-api-access-fwxng\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986751 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-metrics-certs\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986773 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-registration-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986802 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-encryption-config\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986820 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/da89f990-01f3-4fc4-bc66-6f33c9639082-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986838 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dcd0af06-c1e3-44b1-9dfc-af1683bf9893-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h2p9q\" (UID: \"dcd0af06-c1e3-44b1-9dfc-af1683bf9893\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986855 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b1e4b6-8e9c-4e12-8627-469e056beee5-secret-volume\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986873 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-ca\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986889 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a33a8a30-b9bd-4d21-a5f5-aca28766b920-serving-cert\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 
12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986913 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dd7972d-1866-4706-93b7-66fd45227c7f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8jwlf\" (UID: \"8dd7972d-1866-4706-93b7-66fd45227c7f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986939 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986963 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-config\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.986986 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltf77\" (UniqueName: \"kubernetes.io/projected/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-kube-api-access-ltf77\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987008 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-webhook-cert\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987028 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrz5\" (UniqueName: \"kubernetes.io/projected/68451e19-64a1-471e-85c8-7238bb88e14c-kube-api-access-bcrz5\") pod \"control-plane-machine-set-operator-78cbb6b69f-xq5fz\" (UID: \"68451e19-64a1-471e-85c8-7238bb88e14c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987065 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-proxy-tls\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987083 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-node-bootstrap-token\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987110 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xssq2\" (UniqueName: \"kubernetes.io/projected/b89c3c57-363d-472f-a7cf-32b0e1542f60-kube-api-access-xssq2\") pod \"dns-operator-744455d44c-6r22c\" (UID: \"b89c3c57-363d-472f-a7cf-32b0e1542f60\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987124 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-audit\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987140 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a46b2dfd-f10c-4963-bf75-2068a886b420-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987162 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-plugins-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987182 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46d3c5a6-c886-4ae0-b381-95ffb9902718-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987200 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-image-import-ca\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987209 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-service-ca\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987216 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987749 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-trusted-ca\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.987880 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32af1403-874a-49e0-ab8f-96511da15218-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: E0312 14:49:16.988090 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.4880778 +0000 UTC m=+109.773303078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.988394 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-config\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.988923 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-ca\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.988940 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6663b93f-3968-4397-94f9-57c42e90fad8-config\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993377 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993498 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2c29741-810a-48bb-a987-127fe5d45625-srv-cert\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993531 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68451e19-64a1-471e-85c8-7238bb88e14c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xq5fz\" (UID: \"68451e19-64a1-471e-85c8-7238bb88e14c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993631 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-service-ca\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993661 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-bound-sa-token\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993691 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95flb\" (UniqueName: \"kubernetes.io/projected/bb16ce4b-e604-45d9-9635-c2565dcbd228-kube-api-access-95flb\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993713 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9gk\" (UniqueName: \"kubernetes.io/projected/c2c29741-810a-48bb-a987-127fe5d45625-kube-api-access-vq9gk\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993737 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0c481b0-7a0f-4737-8660-fcb14728964b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993759 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkn4\" (UniqueName: \"kubernetes.io/projected/7d89d513-f587-4072-9038-09aa4e0a6b0d-kube-api-access-cgkn4\") pod \"downloads-7954f5f757-6xr6k\" (UID: \"7d89d513-f587-4072-9038-09aa4e0a6b0d\") " pod="openshift-console/downloads-7954f5f757-6xr6k" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.993784 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8nk6\" 
(UniqueName: \"kubernetes.io/projected/a0a0f5f3-30b3-45bf-9545-877b619b94f5-kube-api-access-h8nk6\") pod \"dns-default-624kd\" (UID: \"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.994506 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-service-ca\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.994813 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995060 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-etcd-serving-ca\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995311 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-etcd-client\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995368 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp45t\" (UniqueName: 
\"kubernetes.io/projected/b996bdda-9e5f-403c-9ab3-a1a371388e08-kube-api-access-bp45t\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995417 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a33a8a30-b9bd-4d21-a5f5-aca28766b920-config\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995582 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46d3c5a6-c886-4ae0-b381-95ffb9902718-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995763 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w9hc\" (UniqueName: \"kubernetes.io/projected/7b0de265-9c71-4fbf-a42b-8bb0d84a6284-kube-api-access-8w9hc\") pod \"ingress-canary-bsjt9\" (UID: \"7b0de265-9c71-4fbf-a42b-8bb0d84a6284\") " pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995792 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-registry-tls\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.995819 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-serving-cert\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996130 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-image-import-ca\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996321 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-oauth-serving-cert\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996356 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b0de265-9c71-4fbf-a42b-8bb0d84a6284-cert\") pod \"ingress-canary-bsjt9\" (UID: \"7b0de265-9c71-4fbf-a42b-8bb0d84a6284\") " pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996378 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-config\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996425 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktq2q\" (UniqueName: \"kubernetes.io/projected/fb453a19-efba-405c-b3c3-b73892c2c4ac-kube-api-access-ktq2q\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996449 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996469 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-images\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996494 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0131b05b-4a2d-4bb5-b65f-8531833bd203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.996515 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/15598969-d30d-4c3d-a3ca-e9a75c54fb90-signing-cabundle\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.997364 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-default-certificate\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.997441 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b996bdda-9e5f-403c-9ab3-a1a371388e08-node-pullsecrets\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.997583 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhv8f\" (UniqueName: \"kubernetes.io/projected/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-kube-api-access-xhv8f\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.997618 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb453a19-efba-405c-b3c3-b73892c2c4ac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:16 crc 
kubenswrapper[4869]: I0312 14:49:16.997653 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da89f990-01f3-4fc4-bc66-6f33c9639082-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.997686 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b996bdda-9e5f-403c-9ab3-a1a371388e08-audit-dir\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.997715 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:16 crc kubenswrapper[4869]: I0312 14:49:16.997746 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998507 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0131b05b-4a2d-4bb5-b65f-8531833bd203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998640 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-csi-data-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998673 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6663b93f-3968-4397-94f9-57c42e90fad8-serving-cert\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998699 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998729 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-client\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998749 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a46b2dfd-f10c-4963-bf75-2068a886b420-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998771 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131b05b-4a2d-4bb5-b65f-8531833bd203-config\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998793 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-socket-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998820 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7525t\" (UniqueName: \"kubernetes.io/projected/dcd0af06-c1e3-44b1-9dfc-af1683bf9893-kube-api-access-7525t\") pod \"multus-admission-controller-857f4d67dd-h2p9q\" (UID: \"dcd0af06-c1e3-44b1-9dfc-af1683bf9893\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998842 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-apiservice-cert\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998866 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkpzw\" (UniqueName: \"kubernetes.io/projected/e279336b-ac69-4574-9910-11c1fe663252-kube-api-access-rkpzw\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.998888 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnvx\" (UniqueName: \"kubernetes.io/projected/15598969-d30d-4c3d-a3ca-e9a75c54fb90-kube-api-access-bbnvx\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.999635 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-serving-cert\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.000192 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-console-config\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.000358 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-trusted-ca-bundle\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.000638 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b996bdda-9e5f-403c-9ab3-a1a371388e08-audit-dir\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.001033 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b89c3c57-363d-472f-a7cf-32b0e1542f60-metrics-tls\") pod \"dns-operator-744455d44c-6r22c\" (UID: \"b89c3c57-363d-472f-a7cf-32b0e1542f60\") " pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.003443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.004189 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-registry-tls\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.004290 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b996bdda-9e5f-403c-9ab3-a1a371388e08-node-pullsecrets\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.004351 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-registry-certificates\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:16.989037 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b996bdda-9e5f-403c-9ab3-a1a371388e08-audit\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.004661 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7g5n\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-kube-api-access-n7g5n\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.004749 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdthj\" (UniqueName: \"kubernetes.io/projected/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-kube-api-access-xdthj\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 
14:49:17.004769 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b996bdda-9e5f-403c-9ab3-a1a371388e08-encryption-config\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.004779 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kb6g\" (UniqueName: \"kubernetes.io/projected/8dd7972d-1866-4706-93b7-66fd45227c7f-kube-api-access-7kb6g\") pod \"package-server-manager-789f6589d5-8jwlf\" (UID: \"8dd7972d-1866-4706-93b7-66fd45227c7f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.004884 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb453a19-efba-405c-b3c3-b73892c2c4ac-proxy-tls\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.005164 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-oauth-serving-cert\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.005227 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32af1403-874a-49e0-ab8f-96511da15218-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: 
\"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.007035 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da89f990-01f3-4fc4-bc66-6f33c9639082-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.007814 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-serving-cert\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.008182 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0c481b0-7a0f-4737-8660-fcb14728964b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.009276 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.016135 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-images\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 
14:49:17.017009 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6663b93f-3968-4397-94f9-57c42e90fad8-etcd-client\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.017354 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-oauth-config\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.021105 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.023925 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32af1403-874a-49e0-ab8f-96511da15218-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.026019 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.026433 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da89f990-01f3-4fc4-bc66-6f33c9639082-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.026957 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6663b93f-3968-4397-94f9-57c42e90fad8-serving-cert\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.036672 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.044732 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.048689 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-etcd-client\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.070002 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.077660 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.084244 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.085752 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.087143 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vnnq4" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.105087 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109152 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109351 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w9hc\" (UniqueName: \"kubernetes.io/projected/7b0de265-9c71-4fbf-a42b-8bb0d84a6284-kube-api-access-8w9hc\") pod \"ingress-canary-bsjt9\" (UID: \"7b0de265-9c71-4fbf-a42b-8bb0d84a6284\") " pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 
14:49:17.109380 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46d3c5a6-c886-4ae0-b381-95ffb9902718-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109400 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b0de265-9c71-4fbf-a42b-8bb0d84a6284-cert\") pod \"ingress-canary-bsjt9\" (UID: \"7b0de265-9c71-4fbf-a42b-8bb0d84a6284\") " pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109420 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-config\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109448 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktq2q\" (UniqueName: \"kubernetes.io/projected/fb453a19-efba-405c-b3c3-b73892c2c4ac-kube-api-access-ktq2q\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109474 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-images\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109491 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0131b05b-4a2d-4bb5-b65f-8531833bd203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109509 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15598969-d30d-4c3d-a3ca-e9a75c54fb90-signing-cabundle\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109536 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-default-certificate\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109595 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb453a19-efba-405c-b3c3-b73892c2c4ac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109613 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109645 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109667 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0131b05b-4a2d-4bb5-b65f-8531833bd203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109663 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46d3c5a6-c886-4ae0-b381-95ffb9902718-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109691 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-csi-data-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 
14:49:17.109707 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109723 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a46b2dfd-f10c-4963-bf75-2068a886b420-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109736 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131b05b-4a2d-4bb5-b65f-8531833bd203-config\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109750 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-socket-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109768 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7525t\" (UniqueName: \"kubernetes.io/projected/dcd0af06-c1e3-44b1-9dfc-af1683bf9893-kube-api-access-7525t\") pod 
\"multus-admission-controller-857f4d67dd-h2p9q\" (UID: \"dcd0af06-c1e3-44b1-9dfc-af1683bf9893\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109784 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkpzw\" (UniqueName: \"kubernetes.io/projected/e279336b-ac69-4574-9910-11c1fe663252-kube-api-access-rkpzw\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109801 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbnvx\" (UniqueName: \"kubernetes.io/projected/15598969-d30d-4c3d-a3ca-e9a75c54fb90-kube-api-access-bbnvx\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109817 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-apiservice-cert\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109838 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdthj\" (UniqueName: \"kubernetes.io/projected/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-kube-api-access-xdthj\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109855 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7kb6g\" (UniqueName: \"kubernetes.io/projected/8dd7972d-1866-4706-93b7-66fd45227c7f-kube-api-access-7kb6g\") pod \"package-server-manager-789f6589d5-8jwlf\" (UID: \"8dd7972d-1866-4706-93b7-66fd45227c7f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109879 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb453a19-efba-405c-b3c3-b73892c2c4ac-proxy-tls\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109897 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0c481b0-7a0f-4737-8660-fcb14728964b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109924 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-mountpoint-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109947 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tgj\" (UniqueName: \"kubernetes.io/projected/a33a8a30-b9bd-4d21-a5f5-aca28766b920-kube-api-access-l6tgj\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:17 
crc kubenswrapper[4869]: I0312 14:49:17.109967 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea68feb-ae95-41ab-9d2f-43f111c3721c-profile-collector-cert\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.109986 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxkc7\" (UniqueName: \"kubernetes.io/projected/e5060045-fb36-499c-ab6e-d09eee39afd8-kube-api-access-gxkc7\") pod \"migrator-59844c95c7-c55nj\" (UID: \"e5060045-fb36-499c-ab6e-d09eee39afd8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110007 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea68feb-ae95-41ab-9d2f-43f111c3721c-srv-cert\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110024 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7tzr\" (UniqueName: \"kubernetes.io/projected/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-kube-api-access-c7tzr\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110048 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h74tj\" (UniqueName: \"kubernetes.io/projected/f0c481b0-7a0f-4737-8660-fcb14728964b-kube-api-access-h74tj\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: 
\"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110066 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-certs\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110082 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a0f5f3-30b3-45bf-9545-877b619b94f5-config-volume\") pod \"dns-default-624kd\" (UID: \"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110102 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15598969-d30d-4c3d-a3ca-e9a75c54fb90-signing-key\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110125 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110140 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-tmpfs\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: 
\"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110156 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c84fv\" (UniqueName: \"kubernetes.io/projected/22b1e4b6-8e9c-4e12-8627-469e056beee5-kube-api-access-c84fv\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110173 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqj8g\" (UniqueName: \"kubernetes.io/projected/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-kube-api-access-zqj8g\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110188 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2c29741-810a-48bb-a987-127fe5d45625-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110203 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46d3c5a6-c886-4ae0-b381-95ffb9902718-ready\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110218 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/22b1e4b6-8e9c-4e12-8627-469e056beee5-config-volume\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110232 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0a0f5f3-30b3-45bf-9545-877b619b94f5-metrics-tls\") pod \"dns-default-624kd\" (UID: \"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110248 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110264 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0c481b0-7a0f-4737-8660-fcb14728964b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110295 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fpl\" (UniqueName: \"kubernetes.io/projected/46d3c5a6-c886-4ae0-b381-95ffb9902718-kube-api-access-v7fpl\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110310 4869 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-stats-auth\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110326 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110343 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxz29\" (UniqueName: \"kubernetes.io/projected/4ea68feb-ae95-41ab-9d2f-43f111c3721c-kube-api-access-bxz29\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110357 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e279336b-ac69-4574-9910-11c1fe663252-service-ca-bundle\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110384 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxng\" (UniqueName: \"kubernetes.io/projected/a46b2dfd-f10c-4963-bf75-2068a886b420-kube-api-access-fwxng\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110400 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-metrics-certs\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110414 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-registration-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110432 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dcd0af06-c1e3-44b1-9dfc-af1683bf9893-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h2p9q\" (UID: \"dcd0af06-c1e3-44b1-9dfc-af1683bf9893\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110447 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b1e4b6-8e9c-4e12-8627-469e056beee5-secret-volume\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110465 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a33a8a30-b9bd-4d21-a5f5-aca28766b920-serving-cert\") pod 
\"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110484 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dd7972d-1866-4706-93b7-66fd45227c7f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8jwlf\" (UID: \"8dd7972d-1866-4706-93b7-66fd45227c7f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110484 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-config\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110511 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltf77\" (UniqueName: \"kubernetes.io/projected/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-kube-api-access-ltf77\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110533 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcrz5\" (UniqueName: \"kubernetes.io/projected/68451e19-64a1-471e-85c8-7238bb88e14c-kube-api-access-bcrz5\") pod \"control-plane-machine-set-operator-78cbb6b69f-xq5fz\" (UID: \"68451e19-64a1-471e-85c8-7238bb88e14c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 
14:49:17.110569 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-webhook-cert\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110585 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-proxy-tls\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110600 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-node-bootstrap-token\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110629 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a46b2dfd-f10c-4963-bf75-2068a886b420-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110650 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-plugins-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: 
\"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110680 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46d3c5a6-c886-4ae0-b381-95ffb9902718-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110695 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110712 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110732 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2c29741-810a-48bb-a987-127fe5d45625-srv-cert\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110736 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fb453a19-efba-405c-b3c3-b73892c2c4ac-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110750 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68451e19-64a1-471e-85c8-7238bb88e14c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xq5fz\" (UID: \"68451e19-64a1-471e-85c8-7238bb88e14c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110757 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-socket-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110780 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95flb\" (UniqueName: \"kubernetes.io/projected/bb16ce4b-e604-45d9-9635-c2565dcbd228-kube-api-access-95flb\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110798 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9gk\" (UniqueName: \"kubernetes.io/projected/c2c29741-810a-48bb-a987-127fe5d45625-kube-api-access-vq9gk\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110817 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0c481b0-7a0f-4737-8660-fcb14728964b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110838 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8nk6\" (UniqueName: \"kubernetes.io/projected/a0a0f5f3-30b3-45bf-9545-877b619b94f5-kube-api-access-h8nk6\") pod \"dns-default-624kd\" (UID: \"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.110860 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a33a8a30-b9bd-4d21-a5f5-aca28766b920-config\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.111010 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0131b05b-4a2d-4bb5-b65f-8531833bd203-config\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.111120 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-12 14:49:17.611095529 +0000 UTC m=+109.896320837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.111433 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-images\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.111454 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a33a8a30-b9bd-4d21-a5f5-aca28766b920-config\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.112061 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.112391 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/15598969-d30d-4c3d-a3ca-e9a75c54fb90-signing-cabundle\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.112482 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-tmpfs\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.113665 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-csi-data-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.114081 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0a0f5f3-30b3-45bf-9545-877b619b94f5-metrics-tls\") pod \"dns-default-624kd\" (UID: \"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.114480 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46d3c5a6-c886-4ae0-b381-95ffb9902718-ready\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.114486 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7b0de265-9c71-4fbf-a42b-8bb0d84a6284-cert\") pod \"ingress-canary-bsjt9\" (UID: 
\"7b0de265-9c71-4fbf-a42b-8bb0d84a6284\") " pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.115339 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-apiservice-cert\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.115433 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b1e4b6-8e9c-4e12-8627-469e056beee5-config-volume\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.115488 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb453a19-efba-405c-b3c3-b73892c2c4ac-proxy-tls\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.116914 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-certs\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.117242 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0a0f5f3-30b3-45bf-9545-877b619b94f5-config-volume\") pod \"dns-default-624kd\" (UID: 
\"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.118020 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15598969-d30d-4c3d-a3ca-e9a75c54fb90-signing-key\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.118208 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-webhook-cert\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.118333 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.118791 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.118868 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-mountpoint-dir\") pod 
\"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.120245 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.120311 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0c481b0-7a0f-4737-8660-fcb14728964b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.120467 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c2c29741-810a-48bb-a987-127fe5d45625-profile-collector-cert\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.120623 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-proxy-tls\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.121245 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f0c481b0-7a0f-4737-8660-fcb14728964b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.121315 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-plugins-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.121423 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a33a8a30-b9bd-4d21-a5f5-aca28766b920-serving-cert\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.121950 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.122746 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4ea68feb-ae95-41ab-9d2f-43f111c3721c-srv-cert\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.122958 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.123034 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-registration-dir\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.123642 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4ea68feb-ae95-41ab-9d2f-43f111c3721c-profile-collector-cert\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.124265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e279336b-ac69-4574-9910-11c1fe663252-service-ca-bundle\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.124517 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a46b2dfd-f10c-4963-bf75-2068a886b420-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:17 crc 
kubenswrapper[4869]: I0312 14:49:17.125309 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.125429 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46d3c5a6-c886-4ae0-b381-95ffb9902718-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.126855 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0131b05b-4a2d-4bb5-b65f-8531833bd203-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.129146 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a46b2dfd-f10c-4963-bf75-2068a886b420-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.129255 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c2c29741-810a-48bb-a987-127fe5d45625-srv-cert\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.129509 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b1e4b6-8e9c-4e12-8627-469e056beee5-secret-volume\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.130803 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-node-bootstrap-token\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.131146 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/68451e19-64a1-471e-85c8-7238bb88e14c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xq5fz\" (UID: \"68451e19-64a1-471e-85c8-7238bb88e14c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.132665 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8dd7972d-1866-4706-93b7-66fd45227c7f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8jwlf\" (UID: \"8dd7972d-1866-4706-93b7-66fd45227c7f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.133091 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-stats-auth\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " 
pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.133968 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-metrics-certs\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.134387 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e279336b-ac69-4574-9910-11c1fe663252-default-certificate\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.140935 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dcd0af06-c1e3-44b1-9dfc-af1683bf9893-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h2p9q\" (UID: \"dcd0af06-c1e3-44b1-9dfc-af1683bf9893\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.165701 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.170220 4869 ???:1] "http: TLS handshake error from 192.168.126.11:60860: no serving certificate available for the kubelet" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.178787 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-encryption-config\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.184645 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.185489 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-audit-policies\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.196242 4869 ???:1] "http: TLS handshake error from 192.168.126.11:60862: no serving certificate available for the kubelet" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.205380 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.211369 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.211534 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf9qv\" (UniqueName: \"kubernetes.io/projected/47d3c91d-a968-413d-a941-bc67279bf905-kube-api-access-gf9qv\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.211568 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5br8w\" (UniqueName: \"kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.211756 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.711739391 +0000 UTC m=+109.996964749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.216087 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90ae9f42-560b-4b79-a947-25c6de331025-config\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.225749 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.244587 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 
14:49:17.255807 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf9qv\" (UniqueName: \"kubernetes.io/projected/47d3c91d-a968-413d-a941-bc67279bf905-kube-api-access-gf9qv\") pod \"authentication-operator-69f744f599-fs9qm\" (UID: \"47d3c91d-a968-413d-a941-bc67279bf905\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.264845 4869 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.264897 4869 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.264927 4869 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.264907 4869 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.264920 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.264904176 +0000 UTC m=+110.550129454 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.265043 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls podName:90ae9f42-560b-4b79-a947-25c6de331025 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.26501959 +0000 UTC m=+110.550244948 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-2drd7" (UID: "90ae9f42-560b-4b79-a947-25c6de331025") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.265063 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert podName:ca5302b5-90b9-412e-b378-a1fdedf81184 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.265054681 +0000 UTC m=+110.550280049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert") pod "controller-manager-879f6c89f-jz4tc" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.265076 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca podName:ca5302b5-90b9-412e-b378-a1fdedf81184 nodeName:}" failed. 
No retries permitted until 2026-03-12 14:49:18.265070271 +0000 UTC m=+110.550295639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca") pod "controller-manager-879f6c89f-jz4tc" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.265075 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.264937 4869 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.265310 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.265300578 +0000 UTC m=+110.550525856 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync secret cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.269894 4869 ???:1] "http: TLS handshake error from 192.168.126.11:60876: no serving certificate available for the kubelet" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.284850 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.304887 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.307780 4869 projected.go:194] Error preparing data for projected volume kube-api-access-tqk5t for pod openshift-machine-api/machine-api-operator-5694c8668f-2drd7: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.307943 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90ae9f42-560b-4b79-a947-25c6de331025-kube-api-access-tqk5t podName:90ae9f42-560b-4b79-a947-25c6de331025 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.807921579 +0000 UTC m=+110.093146857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tqk5t" (UniqueName: "kubernetes.io/projected/90ae9f42-560b-4b79-a947-25c6de331025-kube-api-access-tqk5t") pod "machine-api-operator-5694c8668f-2drd7" (UID: "90ae9f42-560b-4b79-a947-25c6de331025") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.312266 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.312602 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.812574908 +0000 UTC m=+110.097800226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.312961 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.313447 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.813426673 +0000 UTC m=+110.098651942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.325025 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.335620 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5br8w\" (UniqueName: \"kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.344939 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.365574 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.367184 4869 ???:1] "http: TLS handshake error from 192.168.126.11:60892: no serving certificate available for the kubelet" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.380031 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.384578 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.389213 4869 ???:1] "http: TLS handshake error from 192.168.126.11:60906: no serving certificate available for the kubelet" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.394034 4869 projected.go:194] Error preparing data for projected volume kube-api-access-hpkzh for pod openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7: failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.394115 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/993b8ae9-69a2-4cb2-806a-888528215561-kube-api-access-hpkzh podName:993b8ae9-69a2-4cb2-806a-888528215561 nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.894093799 +0000 UTC m=+110.179319077 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hpkzh" (UniqueName: "kubernetes.io/projected/993b8ae9-69a2-4cb2-806a-888528215561-kube-api-access-hpkzh") pod "apiserver-7bbb656c7d-qmrc7" (UID: "993b8ae9-69a2-4cb2-806a-888528215561") : failed to sync configmap cache: timed out waiting for the condition Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.405518 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.413957 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.414156 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.914130557 +0000 UTC m=+110.199355835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.414269 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.414927 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:17.91492003 +0000 UTC m=+110.200145308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.425053 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.469668 4869 ???:1] "http: TLS handshake error from 192.168.126.11:60920: no serving certificate available for the kubelet" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.470042 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48bv\" (UniqueName: \"kubernetes.io/projected/e61f813a-db17-46a6-a380-9f13452ef07b-kube-api-access-m48bv\") pod \"console-f9d7485db-qfqjj\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.491606 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjtt\" (UniqueName: \"kubernetes.io/projected/da89f990-01f3-4fc4-bc66-6f33c9639082-kube-api-access-jtjtt\") pod \"openshift-controller-manager-operator-756b6f6bc6-vxlrb\" (UID: \"da89f990-01f3-4fc4-bc66-6f33c9639082\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.499673 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xssq2\" (UniqueName: \"kubernetes.io/projected/b89c3c57-363d-472f-a7cf-32b0e1542f60-kube-api-access-xssq2\") pod \"dns-operator-744455d44c-6r22c\" (UID: 
\"b89c3c57-363d-472f-a7cf-32b0e1542f60\") " pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.515705 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.516579 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.01650101 +0000 UTC m=+110.301726308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.517274 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.517842 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.01782395 +0000 UTC m=+110.303049228 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.520975 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22wc2\" (UniqueName: \"kubernetes.io/projected/6663b93f-3968-4397-94f9-57c42e90fad8-kube-api-access-22wc2\") pod \"etcd-operator-b45778765-4dlk8\" (UID: \"6663b93f-3968-4397-94f9-57c42e90fad8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.528666 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.542967 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-bound-sa-token\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.565391 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkn4\" (UniqueName: \"kubernetes.io/projected/7d89d513-f587-4072-9038-09aa4e0a6b0d-kube-api-access-cgkn4\") pod \"downloads-7954f5f757-6xr6k\" (UID: \"7d89d513-f587-4072-9038-09aa4e0a6b0d\") " pod="openshift-console/downloads-7954f5f757-6xr6k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.568353 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fs9qm"] Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.577146 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp45t\" (UniqueName: \"kubernetes.io/projected/b996bdda-9e5f-403c-9ab3-a1a371388e08-kube-api-access-bp45t\") pod \"apiserver-76f77b778f-2hvvd\" (UID: \"b996bdda-9e5f-403c-9ab3-a1a371388e08\") " pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.605321 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhv8f\" (UniqueName: \"kubernetes.io/projected/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-kube-api-access-xhv8f\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:17 crc 
kubenswrapper[4869]: I0312 14:49:17.619183 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.619772 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.11975634 +0000 UTC m=+110.404981608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.659514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7g5n\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-kube-api-access-n7g5n\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.672956 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w9hc\" (UniqueName: \"kubernetes.io/projected/7b0de265-9c71-4fbf-a42b-8bb0d84a6284-kube-api-access-8w9hc\") pod \"ingress-canary-bsjt9\" (UID: \"7b0de265-9c71-4fbf-a42b-8bb0d84a6284\") " 
pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.681515 4869 ???:1] "http: TLS handshake error from 192.168.126.11:58250: no serving certificate available for the kubelet" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.681581 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85fb0eb0-18ab-4d38-a6e6-f9b2220249e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-n6t79\" (UID: \"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.682898 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktq2q\" (UniqueName: \"kubernetes.io/projected/fb453a19-efba-405c-b3c3-b73892c2c4ac-kube-api-access-ktq2q\") pod \"machine-config-controller-84d6567774-9krm6\" (UID: \"fb453a19-efba-405c-b3c3-b73892c2c4ac\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.700809 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7525t\" (UniqueName: \"kubernetes.io/projected/dcd0af06-c1e3-44b1-9dfc-af1683bf9893-kube-api-access-7525t\") pod \"multus-admission-controller-857f4d67dd-h2p9q\" (UID: \"dcd0af06-c1e3-44b1-9dfc-af1683bf9893\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.715959 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.718960 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkpzw\" (UniqueName: \"kubernetes.io/projected/e279336b-ac69-4574-9910-11c1fe663252-kube-api-access-rkpzw\") pod \"router-default-5444994796-5f46k\" (UID: \"e279336b-ac69-4574-9910-11c1fe663252\") " pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.721495 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.721942 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.221929077 +0000 UTC m=+110.507154355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.741827 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.746775 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbnvx\" (UniqueName: \"kubernetes.io/projected/15598969-d30d-4c3d-a3ca-e9a75c54fb90-kube-api-access-bbnvx\") pod \"service-ca-9c57cc56f-nsnw9\" (UID: \"15598969-d30d-4c3d-a3ca-e9a75c54fb90\") " pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.765116 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h74tj\" (UniqueName: \"kubernetes.io/projected/f0c481b0-7a0f-4737-8660-fcb14728964b-kube-api-access-h74tj\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.770449 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb"] Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.773963 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bsjt9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.779585 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdthj\" (UniqueName: \"kubernetes.io/projected/ae0168ae-d62e-4c9e-be20-6e9a37751c7a-kube-api-access-xdthj\") pod \"machine-config-operator-74547568cd-56f66\" (UID: \"ae0168ae-d62e-4c9e-be20-6e9a37751c7a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.801183 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-6xr6k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.801615 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kb6g\" (UniqueName: \"kubernetes.io/projected/8dd7972d-1866-4706-93b7-66fd45227c7f-kube-api-access-7kb6g\") pod \"package-server-manager-789f6589d5-8jwlf\" (UID: \"8dd7972d-1866-4706-93b7-66fd45227c7f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.809642 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.821483 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.823179 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.823780 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqk5t\" (UniqueName: \"kubernetes.io/projected/90ae9f42-560b-4b79-a947-25c6de331025-kube-api-access-tqk5t\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.823927 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.323888248 +0000 UTC m=+110.609113526 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.841587 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.845707 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqk5t\" (UniqueName: \"kubernetes.io/projected/90ae9f42-560b-4b79-a947-25c6de331025-kube-api-access-tqk5t\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.846883 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltf77\" (UniqueName: \"kubernetes.io/projected/0cf1e554-88c7-44e8-8f5c-2487a10fa32a-kube-api-access-ltf77\") pod \"machine-config-server-dq7n7\" (UID: \"0cf1e554-88c7-44e8-8f5c-2487a10fa32a\") " pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.860368 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrz5\" (UniqueName: \"kubernetes.io/projected/68451e19-64a1-471e-85c8-7238bb88e14c-kube-api-access-bcrz5\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-xq5fz\" (UID: \"68451e19-64a1-471e-85c8-7238bb88e14c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.860807 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" event={"ID":"47d3c91d-a968-413d-a941-bc67279bf905","Type":"ContainerStarted","Data":"c05580d96d2bc32f0eb14c41d0fef672661ce17806a6744a390b6eb955ca9231"} Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.860855 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" event={"ID":"47d3c91d-a968-413d-a941-bc67279bf905","Type":"ContainerStarted","Data":"7cc5447e131d01edc273b54dc74fc26849d23a18a79d5cd3f15b2e876dfa7884"} Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.862264 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0131b05b-4a2d-4bb5-b65f-8531833bd203-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cg4mg\" (UID: \"0131b05b-4a2d-4bb5-b65f-8531833bd203\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.869477 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" event={"ID":"d17df7e2-d85d-4172-aff7-0b5e63605a77","Type":"ContainerStarted","Data":"5dc7d2f2c7615284cd14faa40ada39bde3303e283be4118402c31b24bf3828ba"} Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.869665 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.876699 4869 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.883212 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" event={"ID":"da89f990-01f3-4fc4-bc66-6f33c9639082","Type":"ContainerStarted","Data":"f6199d542d7beb4d1d3cf511534bcdf075cafd533b4664ca4d367c14aa41d977"} Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.889522 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-w6p7c\" (UID: \"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.889873 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.890047 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.905205 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxkc7\" (UniqueName: \"kubernetes.io/projected/e5060045-fb36-499c-ab6e-d09eee39afd8-kube-api-access-gxkc7\") pod \"migrator-59844c95c7-c55nj\" (UID: \"e5060045-fb36-499c-ab6e-d09eee39afd8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.915296 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.922346 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.922739 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fpl\" (UniqueName: \"kubernetes.io/projected/46d3c5a6-c886-4ae0-b381-95ffb9902718-kube-api-access-v7fpl\") pod \"cni-sysctl-allowlist-ds-7zmqf\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.924838 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkzh\" (UniqueName: \"kubernetes.io/projected/993b8ae9-69a2-4cb2-806a-888528215561-kube-api-access-hpkzh\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.924969 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:17 crc kubenswrapper[4869]: E0312 14:49:17.928062 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.428047634 +0000 UTC m=+110.713272912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.931107 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.936668 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkzh\" (UniqueName: \"kubernetes.io/projected/993b8ae9-69a2-4cb2-806a-888528215561-kube-api-access-hpkzh\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.941954 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7tzr\" (UniqueName: \"kubernetes.io/projected/93b6b5e8-18cd-419e-9a09-f8e6c29febd2-kube-api-access-c7tzr\") pod \"csi-hostpathplugin-njkp6\" (UID: \"93b6b5e8-18cd-419e-9a09-f8e6c29febd2\") " pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.942996 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.944700 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.953055 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.954345 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qfqjj"] Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.962636 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c84fv\" (UniqueName: \"kubernetes.io/projected/22b1e4b6-8e9c-4e12-8627-469e056beee5-kube-api-access-c84fv\") pod \"collect-profiles-29555445-z4j5n\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.968834 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:17 crc kubenswrapper[4869]: I0312 14:49:17.997638 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.015820 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.021631 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqj8g\" (UniqueName: \"kubernetes.io/projected/2d3a496a-d6d4-474b-8ab8-fffa4e661e07-kube-api-access-zqj8g\") pod \"packageserver-d55dfcdfc-9xp9n\" (UID: \"2d3a496a-d6d4-474b-8ab8-fffa4e661e07\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.023628 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9gk\" (UniqueName: \"kubernetes.io/projected/c2c29741-810a-48bb-a987-127fe5d45625-kube-api-access-vq9gk\") pod \"olm-operator-6b444d44fb-z4kzx\" (UID: \"c2c29741-810a-48bb-a987-127fe5d45625\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.023684 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tgj\" (UniqueName: \"kubernetes.io/projected/a33a8a30-b9bd-4d21-a5f5-aca28766b920-kube-api-access-l6tgj\") pod \"service-ca-operator-777779d784-ld8nq\" (UID: \"a33a8a30-b9bd-4d21-a5f5-aca28766b920\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.025596 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.025944 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.525902983 +0000 UTC m=+110.811128261 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.026142 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.029593 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.529569322 +0000 UTC m=+110.814794600 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.034877 4869 ???:1] "http: TLS handshake error from 192.168.126.11:58258: no serving certificate available for the kubelet" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.040180 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.041863 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxz29\" (UniqueName: \"kubernetes.io/projected/4ea68feb-ae95-41ab-9d2f-43f111c3721c-kube-api-access-bxz29\") pod \"catalog-operator-68c6474976-z5d9m\" (UID: \"4ea68feb-ae95-41ab-9d2f-43f111c3721c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.051261 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dq7n7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.056263 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.062760 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95flb\" (UniqueName: \"kubernetes.io/projected/bb16ce4b-e604-45d9-9635-c2565dcbd228-kube-api-access-95flb\") pod \"marketplace-operator-79b997595-qdm27\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:18 crc kubenswrapper[4869]: W0312 14:49:18.063692 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61f813a_db17_46a6_a380_9f13452ef07b.slice/crio-a391b4ea97251748aa8094411d4e11a2d4e97590212b932d592a20fe1ced765d WatchSource:0}: Error finding container a391b4ea97251748aa8094411d4e11a2d4e97590212b932d592a20fe1ced765d: Status 404 returned error can't find the container with id a391b4ea97251748aa8094411d4e11a2d4e97590212b932d592a20fe1ced765d Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.090208 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0c481b0-7a0f-4737-8660-fcb14728964b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l4g7m\" (UID: \"f0c481b0-7a0f-4737-8660-fcb14728964b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.099829 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxng\" (UniqueName: \"kubernetes.io/projected/a46b2dfd-f10c-4963-bf75-2068a886b420-kube-api-access-fwxng\") pod \"kube-storage-version-migrator-operator-b67b599dd-8p2d8\" (UID: \"a46b2dfd-f10c-4963-bf75-2068a886b420\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.122400 
4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00636396-50ae-4e3b-b21f-f83d6dcb1ddc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jbg9h\" (UID: \"00636396-50ae-4e3b-b21f-f83d6dcb1ddc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.128168 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.128566 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.628530442 +0000 UTC m=+110.913755720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.139395 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8nk6\" (UniqueName: \"kubernetes.io/projected/a0a0f5f3-30b3-45bf-9545-877b619b94f5-kube-api-access-h8nk6\") pod \"dns-default-624kd\" (UID: \"a0a0f5f3-30b3-45bf-9545-877b619b94f5\") " pod="openshift-dns/dns-default-624kd" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.147506 4869 request.go:700] Waited for 1.000450027s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-65jxf/status Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.170309 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.200616 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.207983 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.229901 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.230314 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.730302828 +0000 UTC m=+111.015528106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.260513 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.277063 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.282312 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.290137 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.308795 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.332073 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.332481 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.832443944 +0000 UTC m=+111.117669222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.332754 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.332866 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: \"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.332995 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.333060 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca\") pod 
\"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.333113 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.333165 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.334515 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.834491585 +0000 UTC m=+111.119716863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.334997 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.335636 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/993b8ae9-69a2-4cb2-806a-888528215561-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.344445 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert\") pod \"controller-manager-879f6c89f-jz4tc\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.347248 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/90ae9f42-560b-4b79-a947-25c6de331025-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2drd7\" (UID: 
\"90ae9f42-560b-4b79-a947-25c6de331025\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.362656 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/993b8ae9-69a2-4cb2-806a-888528215561-serving-cert\") pod \"apiserver-7bbb656c7d-qmrc7\" (UID: \"993b8ae9-69a2-4cb2-806a-888528215561\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.363388 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-624kd" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.398316 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6r22c"] Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.398554 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bsjt9"] Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.412367 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6xr6k"] Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.434691 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.434982 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:18.934965792 +0000 UTC m=+111.220191070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.442978 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4dlk8"] Mar 12 14:49:18 crc kubenswrapper[4869]: W0312 14:49:18.451956 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89c3c57_363d_472f_a7cf_32b0e1542f60.slice/crio-bac4a92022bab41022b9cce9f38f1848b3c6d05f6d9e661bd4fff5cc8318cb67 WatchSource:0}: Error finding container bac4a92022bab41022b9cce9f38f1848b3c6d05f6d9e661bd4fff5cc8318cb67: Status 404 returned error can't find the container with id bac4a92022bab41022b9cce9f38f1848b3c6d05f6d9e661bd4fff5cc8318cb67 Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.470524 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79"] Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.502139 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.513597 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.514770 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2hvvd"] Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.525132 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.536477 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.537153 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.037134199 +0000 UTC m=+111.322359477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.640784 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.641106 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.141080109 +0000 UTC m=+111.426305387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.641151 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.641506 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.141495951 +0000 UTC m=+111.426721439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.709617 4869 ???:1] "http: TLS handshake error from 192.168.126.11:58270: no serving certificate available for the kubelet" Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.770773 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.770981 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.270928202 +0000 UTC m=+111.556153480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.771146 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.771641 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.271630173 +0000 UTC m=+111.556855451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.879297 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:18 crc kubenswrapper[4869]: E0312 14:49:18.879979 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.379953723 +0000 UTC m=+111.665179001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.948768 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dq7n7" event={"ID":"0cf1e554-88c7-44e8-8f5c-2487a10fa32a","Type":"ContainerStarted","Data":"c4a538c1c9ebbd520b06e61b3b6ed2a78a287a78a119680c122380f5de30b7c7"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.949070 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dq7n7" event={"ID":"0cf1e554-88c7-44e8-8f5c-2487a10fa32a","Type":"ContainerStarted","Data":"52cc42267bef545b65404307af12107defc0d8bfd614b48c54f0b791272e70a5"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.955882 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" event={"ID":"da89f990-01f3-4fc4-bc66-6f33c9639082","Type":"ContainerStarted","Data":"957873a1da679882bfc1091c47b860c294d2093a9209520995cb6a77203bce83"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.963180 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5f46k" event={"ID":"e279336b-ac69-4574-9910-11c1fe663252","Type":"ContainerStarted","Data":"a00679aee0cb10e17f29775a51c3b04dec3afa25aaa0ff3e9db5c71522dfd93a"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.963234 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5f46k" 
event={"ID":"e279336b-ac69-4574-9910-11c1fe663252","Type":"ContainerStarted","Data":"42ce0c3b534cf228cfe9eef1543515421a69d50589289213a8e4e76b11f4cdcd"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.967072 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" event={"ID":"46d3c5a6-c886-4ae0-b381-95ffb9902718","Type":"ContainerStarted","Data":"c08f009987d3d7a4091e63b8f8bf03c83adb6ead2095c0f6f8383bf42eef72f0"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.970678 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" event={"ID":"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3","Type":"ContainerStarted","Data":"58d8ad8f91c936e550e64d1cbaecfd8fea36fed1ae0b43194796a2b12d373805"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.972581 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qfqjj" event={"ID":"e61f813a-db17-46a6-a380-9f13452ef07b","Type":"ContainerStarted","Data":"1fe54d1f7b5ae6ead062dc47282a3afb0ee42dd63b1d261fe6c4f8314dbea557"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.972610 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qfqjj" event={"ID":"e61f813a-db17-46a6-a380-9f13452ef07b","Type":"ContainerStarted","Data":"a391b4ea97251748aa8094411d4e11a2d4e97590212b932d592a20fe1ced765d"} Mar 12 14:49:18 crc kubenswrapper[4869]: I0312 14:49:18.978489 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" event={"ID":"6663b93f-3968-4397-94f9-57c42e90fad8","Type":"ContainerStarted","Data":"48ee1f84c70a7473670ca234eb19a1abff90c79ad284b3da50d9bff4eba801f6"} Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.023356 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.024927 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.029267 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.529232765 +0000 UTC m=+111.814458043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.051718 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6xr6k" event={"ID":"7d89d513-f587-4072-9038-09aa4e0a6b0d","Type":"ContainerStarted","Data":"30bf3811e5d71a310425347e9c173c7fd08bc8ae719af5ba631bd0187e2fd12e"} Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.051778 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6xr6k" 
event={"ID":"7d89d513-f587-4072-9038-09aa4e0a6b0d","Type":"ContainerStarted","Data":"b8d9ac2cf4efb8eeff5b85bb7cc74d1ea6e5b11d7a3db0e5f9641d796651d46c"} Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.051801 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6xr6k" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.063961 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" event={"ID":"b89c3c57-363d-472f-a7cf-32b0e1542f60","Type":"ContainerStarted","Data":"bac4a92022bab41022b9cce9f38f1848b3c6d05f6d9e661bd4fff5cc8318cb67"} Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.066154 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bsjt9" event={"ID":"7b0de265-9c71-4fbf-a42b-8bb0d84a6284","Type":"ContainerStarted","Data":"641f38b1b0017e3048aa41bdd23006cd983c9af32a16bd36b165be4f033143f9"} Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.066201 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bsjt9" event={"ID":"7b0de265-9c71-4fbf-a42b-8bb0d84a6284","Type":"ContainerStarted","Data":"0375a461b31ea8b562f505dc0ca63beb724fcda8cfdcba767469aa7523ee031e"} Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.068816 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" event={"ID":"b996bdda-9e5f-403c-9ab3-a1a371388e08","Type":"ContainerStarted","Data":"af65721080a00d8396423359ffefa143d99686bb950064de1746b5aa0c3f0638"} Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.129115 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.134738 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xr6k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.134830 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xr6k" podUID="7d89d513-f587-4072-9038-09aa4e0a6b0d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.136729 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.63669403 +0000 UTC m=+111.921919308 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.137197 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.145517 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.145863 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.645838103 +0000 UTC m=+111.931063431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.246854 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.248125 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.748105813 +0000 UTC m=+112.033331091 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.248604 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.249350 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.74933843 +0000 UTC m=+112.034563708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.353706 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.354166 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.854146266 +0000 UTC m=+112.139371544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.404037 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.418176 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.445440 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.457047 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.457385 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:19.957373424 +0000 UTC m=+112.242598702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: W0312 14:49:19.470636 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5060045_fb36_499c_ab6e_d09eee39afd8.slice/crio-222254fc543f18ff94cdb1cb2f9e330eb9bcb0ac7b06eb98d7db49ac7f74a8ea WatchSource:0}: Error finding container 222254fc543f18ff94cdb1cb2f9e330eb9bcb0ac7b06eb98d7db49ac7f74a8ea: Status 404 returned error can't find the container with id 222254fc543f18ff94cdb1cb2f9e330eb9bcb0ac7b06eb98d7db49ac7f74a8ea Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.558025 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.558457 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.058438199 +0000 UTC m=+112.343663477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.577253 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.593564 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.609105 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.624517 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-56f66"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.634449 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.661585 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.661864 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.161851803 +0000 UTC m=+112.447077081 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.762469 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.769481 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.269452702 +0000 UTC m=+112.554677980 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.769531 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.800068 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h2p9q"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.813961 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nsnw9"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.816289 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.818867 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.825852 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.833477 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qdm27"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.838011 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-624kd"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 
14:49:19.849526 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.855730 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.857670 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jz4tc"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.863608 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2drd7"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.864993 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-njkp6"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.866702 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7"] Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.871744 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.872118 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.372104094 +0000 UTC m=+112.657329372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.877273 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.887222 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:19 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:19 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:19 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.887310 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:19 crc kubenswrapper[4869]: W0312 14:49:19.888530 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea68feb_ae95_41ab_9d2f_43f111c3721c.slice/crio-88b0bc0f7233193cbf27d763561eaa4190f2bdb7256d55809c8957cf39513533 WatchSource:0}: Error finding container 88b0bc0f7233193cbf27d763561eaa4190f2bdb7256d55809c8957cf39513533: Status 404 returned error can't find the container with id 
88b0bc0f7233193cbf27d763561eaa4190f2bdb7256d55809c8957cf39513533 Mar 12 14:49:19 crc kubenswrapper[4869]: W0312 14:49:19.891152 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b1e4b6_8e9c_4e12_8627_469e056beee5.slice/crio-041438a689bfd7960bd2bda7c3eba3c49bbd77e961db16269b8c2c068e68487c WatchSource:0}: Error finding container 041438a689bfd7960bd2bda7c3eba3c49bbd77e961db16269b8c2c068e68487c: Status 404 returned error can't find the container with id 041438a689bfd7960bd2bda7c3eba3c49bbd77e961db16269b8c2c068e68487c Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.908937 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-qfqjj" podStartSLOduration=75.908917361 podStartE2EDuration="1m15.908917361s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:19.88943412 +0000 UTC m=+112.174659398" watchObservedRunningTime="2026-03-12 14:49:19.908917361 +0000 UTC m=+112.194142639" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.909182 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vxlrb" podStartSLOduration=75.909176129 podStartE2EDuration="1m15.909176129s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:19.908169019 +0000 UTC m=+112.193394297" watchObservedRunningTime="2026-03-12 14:49:19.909176129 +0000 UTC m=+112.194401417" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.927391 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5f46k" 
podStartSLOduration=75.927370952 podStartE2EDuration="1m15.927370952s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:19.926989871 +0000 UTC m=+112.212215149" watchObservedRunningTime="2026-03-12 14:49:19.927370952 +0000 UTC m=+112.212596230" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.963021 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.964509 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dq7n7" podStartSLOduration=5.964492709 podStartE2EDuration="5.964492709s" podCreationTimestamp="2026-03-12 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:19.963222841 +0000 UTC m=+112.248448119" watchObservedRunningTime="2026-03-12 14:49:19.964492709 +0000 UTC m=+112.249717987" Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.972605 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.972771 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.472746785 +0000 UTC m=+112.757972053 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: I0312 14:49:19.974799 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:19 crc kubenswrapper[4869]: E0312 14:49:19.975241 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.475224359 +0000 UTC m=+112.760449637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:19 crc kubenswrapper[4869]: W0312 14:49:19.989823 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c29741_810a_48bb_a987_127fe5d45625.slice/crio-b13ed61c80a56ceafc30696b541ac14df4e784524785596eb41ae0e511a392bd WatchSource:0}: Error finding container b13ed61c80a56ceafc30696b541ac14df4e784524785596eb41ae0e511a392bd: Status 404 returned error can't find the container with id b13ed61c80a56ceafc30696b541ac14df4e784524785596eb41ae0e511a392bd Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.012990 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bsjt9" podStartSLOduration=6.012967875 podStartE2EDuration="6.012967875s" podCreationTimestamp="2026-03-12 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:19.997597406 +0000 UTC m=+112.282822704" watchObservedRunningTime="2026-03-12 14:49:20.012967875 +0000 UTC m=+112.298193173" Mar 12 14:49:20 crc kubenswrapper[4869]: W0312 14:49:20.019744 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93b6b5e8_18cd_419e_9a09_f8e6c29febd2.slice/crio-f2bab749bb0f6be12225e1e489a3da36c2e8a5ef2d71deb246ac975d4b87b65c WatchSource:0}: Error finding container f2bab749bb0f6be12225e1e489a3da36c2e8a5ef2d71deb246ac975d4b87b65c: Status 404 
returned error can't find the container with id f2bab749bb0f6be12225e1e489a3da36c2e8a5ef2d71deb246ac975d4b87b65c Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.048164 4869 ???:1] "http: TLS handshake error from 192.168.126.11:58284: no serving certificate available for the kubelet" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.053476 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" podStartSLOduration=76.053445002 podStartE2EDuration="1m16.053445002s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.048779013 +0000 UTC m=+112.334004291" watchObservedRunningTime="2026-03-12 14:49:20.053445002 +0000 UTC m=+112.338670280" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.075350 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.076422 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.576374176 +0000 UTC m=+112.861599454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.076980 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.077654 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.577645174 +0000 UTC m=+112.862870452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.081153 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6xr6k" podStartSLOduration=76.081135678 podStartE2EDuration="1m16.081135678s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.077432837 +0000 UTC m=+112.362658115" watchObservedRunningTime="2026-03-12 14:49:20.081135678 +0000 UTC m=+112.366360956" Mar 12 14:49:20 crc kubenswrapper[4869]: W0312 14:49:20.133351 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb453a19_efba_405c_b3c3_b73892c2c4ac.slice/crio-c5101dc6fecb2c63e7c1be7453c8966728fb84b1124a485bc2f2adb1f8018451 WatchSource:0}: Error finding container c5101dc6fecb2c63e7c1be7453c8966728fb84b1124a485bc2f2adb1f8018451: Status 404 returned error can't find the container with id c5101dc6fecb2c63e7c1be7453c8966728fb84b1124a485bc2f2adb1f8018451 Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.157317 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" podStartSLOduration=76.157291609 podStartE2EDuration="1m16.157291609s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 14:49:20.155817975 +0000 UTC m=+112.441043263" watchObservedRunningTime="2026-03-12 14:49:20.157291609 +0000 UTC m=+112.442516897" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.175492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" event={"ID":"0131b05b-4a2d-4bb5-b65f-8531833bd203","Type":"ContainerStarted","Data":"57c67bcef676e58281df70e81a4d302114d204572a26affeb7493186070bb5f9"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.175743 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" event={"ID":"0131b05b-4a2d-4bb5-b65f-8531833bd203","Type":"ContainerStarted","Data":"d407f1c96c6a616419dc61126e1f5819eece2131bc5e8f797a11f45920bfe019"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.177720 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.178185 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.678165622 +0000 UTC m=+112.963390900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.182794 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" event={"ID":"85fb0eb0-18ab-4d38-a6e6-f9b2220249e3","Type":"ContainerStarted","Data":"6063827b13b92e480e09a00915f17b6190b070c84cb162a5e32e8d480ae166dc"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.184707 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" event={"ID":"f0c481b0-7a0f-4737-8660-fcb14728964b","Type":"ContainerStarted","Data":"4516a650db3aaa542b942d31574e4d184b5fe772758317219d733e1a43f44cfc"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.184739 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" event={"ID":"f0c481b0-7a0f-4737-8660-fcb14728964b","Type":"ContainerStarted","Data":"570f8715f669cc667bc7a521cab41439a2d1469ab7dc28fc11b9a0709d218427"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.185484 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" event={"ID":"dcd0af06-c1e3-44b1-9dfc-af1683bf9893","Type":"ContainerStarted","Data":"14cb546e8d2ece9d0a6f3037a4b4eb332fe42418ea791f98db4fd3881dd55fe9"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.186277 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" 
event={"ID":"90ae9f42-560b-4b79-a947-25c6de331025","Type":"ContainerStarted","Data":"14ee188ebdc9aed65b67d21349a6b35d1bf8dcf24af7d3448d4ea64fb37683de"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.187310 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" event={"ID":"68451e19-64a1-471e-85c8-7238bb88e14c","Type":"ContainerStarted","Data":"48df3668d11ac50cea1e3dc399adcf503666115a69df6745ca049b2966e2f238"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.187337 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" event={"ID":"68451e19-64a1-471e-85c8-7238bb88e14c","Type":"ContainerStarted","Data":"a4b94b584e00c872c9794d66d5bdbecbe015148c2e731cf0fc8e99acefafd201"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.189810 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5hczk" podStartSLOduration=76.189796279 podStartE2EDuration="1m16.189796279s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.177128291 +0000 UTC m=+112.462353569" watchObservedRunningTime="2026-03-12 14:49:20.189796279 +0000 UTC m=+112.475021557" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.202252 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-65jxf" podStartSLOduration=76.202222729 podStartE2EDuration="1m16.202222729s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.201659772 +0000 UTC m=+112.486885060" 
watchObservedRunningTime="2026-03-12 14:49:20.202222729 +0000 UTC m=+112.487448007" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.221175 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" event={"ID":"ae0168ae-d62e-4c9e-be20-6e9a37751c7a","Type":"ContainerStarted","Data":"dd7a9b95585671ad09db4302253d9ab10878495461f4a7cfbaf63f0cb23e853d"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.231594 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" event={"ID":"22b1e4b6-8e9c-4e12-8627-469e056beee5","Type":"ContainerStarted","Data":"041438a689bfd7960bd2bda7c3eba3c49bbd77e961db16269b8c2c068e68487c"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.235714 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" event={"ID":"993b8ae9-69a2-4cb2-806a-888528215561","Type":"ContainerStarted","Data":"4cc29d5e04a1ff83ec5dc1cd22eaae07cd43b80db023ed1da7693e3aff066c84"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.243020 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tcc6m" podStartSLOduration=76.242965144 podStartE2EDuration="1m16.242965144s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.242380167 +0000 UTC m=+112.527605435" watchObservedRunningTime="2026-03-12 14:49:20.242965144 +0000 UTC m=+112.528190432" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.244988 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" 
event={"ID":"b89c3c57-363d-472f-a7cf-32b0e1542f60","Type":"ContainerStarted","Data":"87c61e3e76671b6ce1efadf7f4a43db306ebd48bc5da7f84cd8f125dd228d88a"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.245143 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" event={"ID":"b89c3c57-363d-472f-a7cf-32b0e1542f60","Type":"ContainerStarted","Data":"9e3b6ec381a3628473b9c87a9c0488d1ab2774678b94ea87e2fb15194f8ff648"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.250997 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" event={"ID":"4ea68feb-ae95-41ab-9d2f-43f111c3721c","Type":"ContainerStarted","Data":"88b0bc0f7233193cbf27d763561eaa4190f2bdb7256d55809c8957cf39513533"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.252471 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" event={"ID":"46d3c5a6-c886-4ae0-b381-95ffb9902718","Type":"ContainerStarted","Data":"72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.252724 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.260031 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" event={"ID":"2d3a496a-d6d4-474b-8ab8-fffa4e661e07","Type":"ContainerStarted","Data":"6d8fac5ce1436325d2f80fc4ca66381d94218661496a230ec7dc0467983326ab"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.261459 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" 
event={"ID":"ca5302b5-90b9-412e-b378-a1fdedf81184","Type":"ContainerStarted","Data":"fda7d60aa80b88d10eda0ef5937de273b0428829438fa9ec20ad9b4f873b8a40"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.263188 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" event={"ID":"15598969-d30d-4c3d-a3ca-e9a75c54fb90","Type":"ContainerStarted","Data":"2bbb4f73b4e080b3b6220cd0b5296546f148378dbfc9413df751e14aa45ada92"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.278942 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.285842 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.783245226 +0000 UTC m=+113.068470504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.285830 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vnnq4" podStartSLOduration=76.285810392 podStartE2EDuration="1m16.285810392s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.278115733 +0000 UTC m=+112.563341011" watchObservedRunningTime="2026-03-12 14:49:20.285810392 +0000 UTC m=+112.571035670" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.296287 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" event={"ID":"e5060045-fb36-499c-ab6e-d09eee39afd8","Type":"ContainerStarted","Data":"e3d6c71c99168d6134ee5cfc41584430e925b407feef65a047f282a17d92eee6"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.296375 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" event={"ID":"e5060045-fb36-499c-ab6e-d09eee39afd8","Type":"ContainerStarted","Data":"222254fc543f18ff94cdb1cb2f9e330eb9bcb0ac7b06eb98d7db49ac7f74a8ea"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.306603 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" 
event={"ID":"93b6b5e8-18cd-419e-9a09-f8e6c29febd2","Type":"ContainerStarted","Data":"f2bab749bb0f6be12225e1e489a3da36c2e8a5ef2d71deb246ac975d4b87b65c"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.308248 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" event={"ID":"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0","Type":"ContainerStarted","Data":"f5a34e444a0b36b07dc1e996a4f0513e21fee313c8fc257bec543caac215887b"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.309867 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" event={"ID":"a46b2dfd-f10c-4963-bf75-2068a886b420","Type":"ContainerStarted","Data":"e5c66fb794dea872c111b18a400b20eb03b22de57cd0004cdcf78cb60006c3f0"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.314606 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-624kd" event={"ID":"a0a0f5f3-30b3-45bf-9545-877b619b94f5","Type":"ContainerStarted","Data":"1408f4b8e80846dc768b4b7ae95975c00c142385524c3f00221944bcc85ba547"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.320728 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" event={"ID":"c2c29741-810a-48bb-a987-127fe5d45625","Type":"ContainerStarted","Data":"b13ed61c80a56ceafc30696b541ac14df4e784524785596eb41ae0e511a392bd"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.324788 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fs9qm" podStartSLOduration=76.324753294 podStartE2EDuration="1m16.324753294s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 14:49:20.318686443 +0000 UTC m=+112.603911721" watchObservedRunningTime="2026-03-12 14:49:20.324753294 +0000 UTC m=+112.609978572" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.330180 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" event={"ID":"00636396-50ae-4e3b-b21f-f83d6dcb1ddc","Type":"ContainerStarted","Data":"f474b6944c30777e0696c06b0d8bf9efc68b55f667dddbb757e0a5353594a875"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.338944 4869 generic.go:334] "Generic (PLEG): container finished" podID="b996bdda-9e5f-403c-9ab3-a1a371388e08" containerID="bd9c1c55bd434bd1b14223a02b00cb4530cbf92bcedb514d0db30d26c379548d" exitCode=0 Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.380659 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.382570 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.882463835 +0000 UTC m=+113.167689113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.385636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" event={"ID":"b996bdda-9e5f-403c-9ab3-a1a371388e08","Type":"ContainerDied","Data":"bd9c1c55bd434bd1b14223a02b00cb4530cbf92bcedb514d0db30d26c379548d"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.385783 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" event={"ID":"6663b93f-3968-4397-94f9-57c42e90fad8","Type":"ContainerStarted","Data":"9fbe6e84a1bb89a9403aa3ada56515516b139f73a54782cc151d2c14d86d69cd"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.385852 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" event={"ID":"bb16ce4b-e604-45d9-9635-c2565dcbd228","Type":"ContainerStarted","Data":"54d82dc2700af463e03e36dece7879feeb7a725100c2349bef6d0a02d1105f36"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.396868 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" event={"ID":"a33a8a30-b9bd-4d21-a5f5-aca28766b920","Type":"ContainerStarted","Data":"dca4297d2ae7c3314c21f9ca96b68933bc95579f05e1a9b38e784ca1e00589a9"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.421256 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" 
podStartSLOduration=76.42120788 podStartE2EDuration="1m16.42120788s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.371661283 +0000 UTC m=+112.656886561" watchObservedRunningTime="2026-03-12 14:49:20.42120788 +0000 UTC m=+112.706433168" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.422790 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" event={"ID":"8dd7972d-1866-4706-93b7-66fd45227c7f","Type":"ContainerStarted","Data":"bb31907672cd3df94310411a4ad9a9ae473b622fd223483918c9357775f6eb8d"} Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.423763 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xr6k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.423837 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xr6k" podUID="7d89d513-f587-4072-9038-09aa4e0a6b0d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.453845 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.485195 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: 
\"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.487315 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:20.987299221 +0000 UTC m=+113.272524499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.497087 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cg4mg" podStartSLOduration=76.497068253 podStartE2EDuration="1m16.497068253s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.492900909 +0000 UTC m=+112.778126187" watchObservedRunningTime="2026-03-12 14:49:20.497068253 +0000 UTC m=+112.782293521" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.590377 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 
14:49:20.590900 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.090874431 +0000 UTC m=+113.376099709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.605508 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" podStartSLOduration=6.605490057 podStartE2EDuration="6.605490057s" podCreationTimestamp="2026-03-12 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.603439685 +0000 UTC m=+112.888664963" watchObservedRunningTime="2026-03-12 14:49:20.605490057 +0000 UTC m=+112.890715335" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.679350 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4dlk8" podStartSLOduration=76.679333759 podStartE2EDuration="1m16.679333759s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.677933367 +0000 UTC m=+112.963158645" watchObservedRunningTime="2026-03-12 14:49:20.679333759 +0000 UTC m=+112.964559037" Mar 12 14:49:20 crc 
kubenswrapper[4869]: I0312 14:49:20.693145 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.693515 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.193497991 +0000 UTC m=+113.478723269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.703179 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jz4tc"] Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.728194 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xq5fz" podStartSLOduration=76.728171805 podStartE2EDuration="1m16.728171805s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.715182038 +0000 UTC m=+113.000407316" watchObservedRunningTime="2026-03-12 
14:49:20.728171805 +0000 UTC m=+113.013397083" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.760182 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4"] Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.760529 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" podUID="3ca821a8-9fb1-4ba9-8955-0969735ee00a" containerName="route-controller-manager" containerID="cri-o://df4e08e72f43fd35b430a46743215055ead330b913d6de1df46d0281e82f96b4" gracePeriod=30 Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.794147 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.794266 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.294234546 +0000 UTC m=+113.579459824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.794576 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.794964 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.294948827 +0000 UTC m=+113.580174105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.814820 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6r22c" podStartSLOduration=76.814786609 podStartE2EDuration="1m16.814786609s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:20.803029788 +0000 UTC m=+113.088255066" watchObservedRunningTime="2026-03-12 14:49:20.814786609 +0000 UTC m=+113.100011887" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.899369 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:20 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:20 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:20 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.899482 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:20 crc kubenswrapper[4869]: I0312 14:49:20.900327 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:20 crc kubenswrapper[4869]: E0312 14:49:20.900795 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.400774263 +0000 UTC m=+113.685999541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.002857 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.003315 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.503302951 +0000 UTC m=+113.788528229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.109045 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.109658 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.609635622 +0000 UTC m=+113.894860900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.214549 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.214603 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.214643 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.214694 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.214736 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.217295 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.717271123 +0000 UTC m=+114.002496391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.315617 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.316494 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.318448 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.818389558 +0000 UTC m=+114.103615066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.332094 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.334954 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.335020 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.335556 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.339036 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8415254a-55e8-451e-8be1-364b98f44196-metrics-certs\") pod \"network-metrics-daemon-hllm5\" (UID: \"8415254a-55e8-451e-8be1-364b98f44196\") " pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.358086 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.372201 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.376896 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.387365 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hllm5" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.419864 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.420380 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:21.92036142 +0000 UTC m=+114.205586698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.432133 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n6t79" podStartSLOduration=77.43210815 podStartE2EDuration="1m17.43210815s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.43009817 +0000 UTC m=+113.715323448" watchObservedRunningTime="2026-03-12 14:49:21.43210815 +0000 UTC m=+113.717333428" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 
14:49:21.461581 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" event={"ID":"a33a8a30-b9bd-4d21-a5f5-aca28766b920","Type":"ContainerStarted","Data":"1a48d556d44a74c3663b8f129d720e72a72c2f468f007b17c1f1ff9f6d90d442"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.463369 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" event={"ID":"fb453a19-efba-405c-b3c3-b73892c2c4ac","Type":"ContainerStarted","Data":"c5101dc6fecb2c63e7c1be7453c8966728fb84b1124a485bc2f2adb1f8018451"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.470989 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" event={"ID":"e5060045-fb36-499c-ab6e-d09eee39afd8","Type":"ContainerStarted","Data":"71c874d76de45e83ac5b9f2f0aafb23d553338774345d341a9d0a7ba864a3dc8"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.526363 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.528361 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.02834094 +0000 UTC m=+114.313566398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.568362 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" event={"ID":"2d3a496a-d6d4-474b-8ab8-fffa4e661e07","Type":"ContainerStarted","Data":"4f4e8f9ddc2745834ca4f899b8becc49522214c4792fcf32c5f88ad0e9d213a9"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.570791 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.585367 4869 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9xp9n container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.585450 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" podUID="2d3a496a-d6d4-474b-8ab8-fffa4e661e07" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.604777 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c55nj" podStartSLOduration=77.604758919 
podStartE2EDuration="1m17.604758919s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.603157272 +0000 UTC m=+113.888382560" watchObservedRunningTime="2026-03-12 14:49:21.604758919 +0000 UTC m=+113.889984197" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.629089 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.630442 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.130425265 +0000 UTC m=+114.415650543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.639689 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8nq" podStartSLOduration=77.6396577 podStartE2EDuration="1m17.6396577s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.63930352 +0000 UTC m=+113.924528798" watchObservedRunningTime="2026-03-12 14:49:21.6396577 +0000 UTC m=+113.924882978" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.661319 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" event={"ID":"bb16ce4b-e604-45d9-9635-c2565dcbd228","Type":"ContainerStarted","Data":"28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.662265 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.666793 4869 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qdm27 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.666852 
4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.689617 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" podStartSLOduration=77.689596948 podStartE2EDuration="1m17.689596948s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.683360623 +0000 UTC m=+113.968585901" watchObservedRunningTime="2026-03-12 14:49:21.689596948 +0000 UTC m=+113.974822226" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.710950 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" event={"ID":"90ae9f42-560b-4b79-a947-25c6de331025","Type":"ContainerStarted","Data":"6da1bcd1bfe9eac83823d490be963f2e0699bf766c666a4c85e669d0e535b3d5"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.719964 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" event={"ID":"8dd7972d-1866-4706-93b7-66fd45227c7f","Type":"ContainerStarted","Data":"cb953d309ed69ab3ff3bbc3664af621dd0e76650ed73ebfa297a54231d4edd0b"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.731361 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.732698 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.232678003 +0000 UTC m=+114.517903281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.746036 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" podStartSLOduration=77.746006761 podStartE2EDuration="1m17.746006761s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.745165326 +0000 UTC m=+114.030390604" watchObservedRunningTime="2026-03-12 14:49:21.746006761 +0000 UTC m=+114.031232039" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.767314 4869 generic.go:334] "Generic (PLEG): container finished" podID="3ca821a8-9fb1-4ba9-8955-0969735ee00a" containerID="df4e08e72f43fd35b430a46743215055ead330b913d6de1df46d0281e82f96b4" exitCode=0 Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.767464 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" 
event={"ID":"3ca821a8-9fb1-4ba9-8955-0969735ee00a","Type":"ContainerDied","Data":"df4e08e72f43fd35b430a46743215055ead330b913d6de1df46d0281e82f96b4"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.775773 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" event={"ID":"4bc23dbe-e45c-4c9d-99c8-b7cec390a6b0","Type":"ContainerStarted","Data":"f933aed7f4980e3c02f6413136fd0d96d8374d32aacea75e0e7329bdc62820bd"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.795918 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" event={"ID":"00636396-50ae-4e3b-b21f-f83d6dcb1ddc","Type":"ContainerStarted","Data":"615c45737fe3a1f999888469457c1d2adeea7c5d82bad07a1345c4890602324d"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.836222 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" event={"ID":"ae0168ae-d62e-4c9e-be20-6e9a37751c7a","Type":"ContainerStarted","Data":"3276f35feb96f87d808576e372a860c838726ac97423421e1032a17996de2192"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.836909 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.838220 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-12 14:49:22.338203491 +0000 UTC m=+114.623428769 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.855036 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jbg9h" podStartSLOduration=77.855016912 podStartE2EDuration="1m17.855016912s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.854036173 +0000 UTC m=+114.139261451" watchObservedRunningTime="2026-03-12 14:49:21.855016912 +0000 UTC m=+114.140242190" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.855665 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-w6p7c" podStartSLOduration=77.855658131 podStartE2EDuration="1m17.855658131s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.812972588 +0000 UTC m=+114.098197866" watchObservedRunningTime="2026-03-12 14:49:21.855658131 +0000 UTC m=+114.140883409" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.867316 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" 
event={"ID":"22b1e4b6-8e9c-4e12-8627-469e056beee5","Type":"ContainerStarted","Data":"f0b3a4fd58f3b8d4d5103d22eee5bb2e2cb19c1f2f54ddf41c09d93e9caf40d7"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.890833 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" event={"ID":"a46b2dfd-f10c-4963-bf75-2068a886b420","Type":"ContainerStarted","Data":"b729286d2c0136c60579f03479818ee94b55c0ade5d1bff52e8049e3e5e128dc"} Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.896364 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:21 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:21 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:21 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.896436 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.911376 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" podStartSLOduration=77.911344392 podStartE2EDuration="1m17.911344392s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.880220474 +0000 UTC m=+114.165445752" watchObservedRunningTime="2026-03-12 14:49:21.911344392 +0000 UTC m=+114.196569670" Mar 12 14:49:21 crc 
kubenswrapper[4869]: I0312 14:49:21.917804 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.950530 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.950640 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-config\") pod \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.950803 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.450744377 +0000 UTC m=+114.735969655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.950859 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca821a8-9fb1-4ba9-8955-0969735ee00a-serving-cert\") pod \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.950977 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-client-ca\") pod \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.951011 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp5mh\" (UniqueName: \"kubernetes.io/projected/3ca821a8-9fb1-4ba9-8955-0969735ee00a-kube-api-access-tp5mh\") pod \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\" (UID: \"3ca821a8-9fb1-4ba9-8955-0969735ee00a\") " Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.959919 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-config" (OuterVolumeSpecName: "config") pod "3ca821a8-9fb1-4ba9-8955-0969735ee00a" (UID: "3ca821a8-9fb1-4ba9-8955-0969735ee00a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.961772 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-client-ca" (OuterVolumeSpecName: "client-ca") pod "3ca821a8-9fb1-4ba9-8955-0969735ee00a" (UID: "3ca821a8-9fb1-4ba9-8955-0969735ee00a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.962310 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.963772 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:21 crc kubenswrapper[4869]: E0312 14:49:21.966940 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.46692102 +0000 UTC m=+114.752146298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.977188 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca821a8-9fb1-4ba9-8955-0969735ee00a-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.980280 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" podStartSLOduration=77.980255167 podStartE2EDuration="1m17.980255167s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.913655061 +0000 UTC m=+114.198880359" watchObservedRunningTime="2026-03-12 14:49:21.980255167 +0000 UTC m=+114.265480435" Mar 12 14:49:21 crc kubenswrapper[4869]: I0312 14:49:21.980592 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8p2d8" podStartSLOduration=77.980588017 podStartE2EDuration="1m17.980588017s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:21.950082877 +0000 UTC m=+114.235308175" watchObservedRunningTime="2026-03-12 14:49:21.980588017 +0000 UTC m=+114.265813305" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 
14:49:22.003823 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca821a8-9fb1-4ba9-8955-0969735ee00a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3ca821a8-9fb1-4ba9-8955-0969735ee00a" (UID: "3ca821a8-9fb1-4ba9-8955-0969735ee00a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.010700 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca821a8-9fb1-4ba9-8955-0969735ee00a-kube-api-access-tp5mh" (OuterVolumeSpecName: "kube-api-access-tp5mh") pod "3ca821a8-9fb1-4ba9-8955-0969735ee00a" (UID: "3ca821a8-9fb1-4ba9-8955-0969735ee00a"). InnerVolumeSpecName "kube-api-access-tp5mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.079219 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.079631 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.579508197 +0000 UTC m=+114.864733475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.080063 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.080142 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ca821a8-9fb1-4ba9-8955-0969735ee00a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.080164 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp5mh\" (UniqueName: \"kubernetes.io/projected/3ca821a8-9fb1-4ba9-8955-0969735ee00a-kube-api-access-tp5mh\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.081409 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.581399924 +0000 UTC m=+114.866625202 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.181381 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.182379 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.682336094 +0000 UTC m=+114.967561372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.186265 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hllm5"] Mar 12 14:49:22 crc kubenswrapper[4869]: W0312 14:49:22.217024 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8415254a_55e8_451e_8be1_364b98f44196.slice/crio-d766fcf84e54fe2b67cd43f1d692358454e873a170bab2169a678273ad7dfc46 WatchSource:0}: Error finding container d766fcf84e54fe2b67cd43f1d692358454e873a170bab2169a678273ad7dfc46: Status 404 returned error can't find the container with id d766fcf84e54fe2b67cd43f1d692358454e873a170bab2169a678273ad7dfc46 Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.293578 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.294107 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.794090477 +0000 UTC m=+115.079315755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: W0312 14:49:22.306396 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d2db668a952d9aaa83cef7ce87235325efd6390a22e4806d0c3f44d1b157b075 WatchSource:0}: Error finding container d2db668a952d9aaa83cef7ce87235325efd6390a22e4806d0c3f44d1b157b075: Status 404 returned error can't find the container with id d2db668a952d9aaa83cef7ce87235325efd6390a22e4806d0c3f44d1b157b075 Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.397498 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.398017 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.897993566 +0000 UTC m=+115.183218844 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.490745 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zmqf"] Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.498961 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.499382 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:22.99936732 +0000 UTC m=+115.284592598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.602203 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.602942 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.102922488 +0000 UTC m=+115.388147766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.663951 4869 ???:1] "http: TLS handshake error from 192.168.126.11:58288: no serving certificate available for the kubelet" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.704294 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.704855 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.204830167 +0000 UTC m=+115.490055445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.805129 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.805292 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.305260013 +0000 UTC m=+115.590485291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.805438 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.805801 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.305792098 +0000 UTC m=+115.591017366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.845591 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"] Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.845779 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca821a8-9fb1-4ba9-8955-0969735ee00a" containerName="route-controller-manager" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.845791 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca821a8-9fb1-4ba9-8955-0969735ee00a" containerName="route-controller-manager" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.845880 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca821a8-9fb1-4ba9-8955-0969735ee00a" containerName="route-controller-manager" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.846195 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.883632 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:22 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:22 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:22 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.883736 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.904416 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3139de8329f5f0dde14f95c4f3aa1f96af001bb0885c754797e3d731211181aa"} Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.904868 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"08997924aaec20fe57772a58f25e8e448d04ccacd2d2d0ce943ea3563a9b6fec"} Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.906072 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.906276 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11f8193e-ba56-4421-9458-e7f1c214db2b-serving-cert\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.906303 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqpgb\" (UniqueName: \"kubernetes.io/projected/11f8193e-ba56-4421-9458-e7f1c214db2b-kube-api-access-nqpgb\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:22 crc kubenswrapper[4869]: E0312 14:49:22.906376 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.406355618 +0000 UTC m=+115.691580896 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.906483 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-client-ca\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.906651 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-config\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.908370 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-56f66" event={"ID":"ae0168ae-d62e-4c9e-be20-6e9a37751c7a","Type":"ContainerStarted","Data":"1caf0078d717fd7398141abf550922ccec43de2fa9d21578d5725246165ed014"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.920981 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" event={"ID":"b996bdda-9e5f-403c-9ab3-a1a371388e08","Type":"ContainerStarted","Data":"dce4e062487e8a65251dedc66bdd7fbdc1e62785f01678a3a491e554ea42fb5c"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.921024 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" event={"ID":"b996bdda-9e5f-403c-9ab3-a1a371388e08","Type":"ContainerStarted","Data":"c66f1d5d167e6ed340f5ff5187752f1724ec06b2f71d8224257152695b89071c"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.923268 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4" event={"ID":"3ca821a8-9fb1-4ba9-8955-0969735ee00a","Type":"ContainerDied","Data":"2904cab08b0207415816fc7088f59777b0dba6a18061e1a51ead0d965d990440"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.923299 4869 scope.go:117] "RemoveContainer" containerID="df4e08e72f43fd35b430a46743215055ead330b913d6de1df46d0281e82f96b4"
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.923416 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4"
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.929965 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"]
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.939313 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" event={"ID":"ca5302b5-90b9-412e-b378-a1fdedf81184","Type":"ContainerStarted","Data":"fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.939462 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" podUID="ca5302b5-90b9-412e-b378-a1fdedf81184" containerName="controller-manager" containerID="cri-o://fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691" gracePeriod=30
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.940079 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc"
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.941127 4869 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-jz4tc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.941155 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" podUID="ca5302b5-90b9-412e-b378-a1fdedf81184" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.942236 4869 generic.go:334] "Generic (PLEG): container finished" podID="993b8ae9-69a2-4cb2-806a-888528215561" containerID="d61bf8bbb3fb18e81ceecdd3bd8c23fd20bc1ca86b0745315de4c140eca32215" exitCode=0
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.942310 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" event={"ID":"993b8ae9-69a2-4cb2-806a-888528215561","Type":"ContainerDied","Data":"d61bf8bbb3fb18e81ceecdd3bd8c23fd20bc1ca86b0745315de4c140eca32215"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.947256 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0c93b96b26bad1e8b632669fb9f8918cabc673fed15ab94dd7c66780ccac98e3"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.947324 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a8e509b7db27831b660f766ac2d9f4e144baa2db6b2e9302c1475a6a9e01c26a"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.950464 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" event={"ID":"93b6b5e8-18cd-419e-9a09-f8e6c29febd2","Type":"ContainerStarted","Data":"a0a22251047f6fc2b5119853678539396bfe1d93f213fe098658b83e5555251d"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.951487 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hllm5" event={"ID":"8415254a-55e8-451e-8be1-364b98f44196","Type":"ContainerStarted","Data":"7a0e00ed6c858173c91789cfa75cc7d9dc36518d71353e0c9e333cfb0528341f"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.951510 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hllm5" event={"ID":"8415254a-55e8-451e-8be1-364b98f44196","Type":"ContainerStarted","Data":"d766fcf84e54fe2b67cd43f1d692358454e873a170bab2169a678273ad7dfc46"}
Mar 12 14:49:22 crc kubenswrapper[4869]: I0312 14:49:22.955350 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" event={"ID":"90ae9f42-560b-4b79-a947-25c6de331025","Type":"ContainerStarted","Data":"83d309002fdaad603a6d8f3dea0f2f5ac5955f5b12a4626ba5b8c40854ddaefa"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.013736 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.013913 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-client-ca\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.014003 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-config\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.014169 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11f8193e-ba56-4421-9458-e7f1c214db2b-serving-cert\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.014224 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqpgb\" (UniqueName: \"kubernetes.io/projected/11f8193e-ba56-4421-9458-e7f1c214db2b-kube-api-access-nqpgb\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.014705 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.514692389 +0000 UTC m=+115.799917667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.018350 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-client-ca\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.019493 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-config\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.055686 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" podStartSLOduration=79.055671901 podStartE2EDuration="1m19.055671901s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.053885428 +0000 UTC m=+115.339110706" watchObservedRunningTime="2026-03-12 14:49:23.055671901 +0000 UTC m=+115.340897179"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.075631 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" event={"ID":"8dd7972d-1866-4706-93b7-66fd45227c7f","Type":"ContainerStarted","Data":"0a45b5a076a82db38fab7552f19c2ffbfb23a6d80c33f14c04294fd555d119c4"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.076350 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.076844 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11f8193e-ba56-4421-9458-e7f1c214db2b-serving-cert\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.102510 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqpgb\" (UniqueName: \"kubernetes.io/projected/11f8193e-ba56-4421-9458-e7f1c214db2b-kube-api-access-nqpgb\") pod \"route-controller-manager-f79bfd8b-n2qrd\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.115260 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.117137 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.617115694 +0000 UTC m=+115.902340972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.125489 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" event={"ID":"f0c481b0-7a0f-4737-8660-fcb14728964b","Type":"ContainerStarted","Data":"1b85529a14b78e99d491a31ba9419619ffedbdfcd88f51988ca79868261258a0"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.129173 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" podStartSLOduration=79.129157103 podStartE2EDuration="1m19.129157103s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.128971527 +0000 UTC m=+115.414196805" watchObservedRunningTime="2026-03-12 14:49:23.129157103 +0000 UTC m=+115.414382381"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.142587 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" event={"ID":"dcd0af06-c1e3-44b1-9dfc-af1683bf9893","Type":"ContainerStarted","Data":"11b91c66906f255cd5672b25959f4f4a9aeeba62f309031e91b945fc83169b58"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.142650 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" event={"ID":"dcd0af06-c1e3-44b1-9dfc-af1683bf9893","Type":"ContainerStarted","Data":"2c5a31d2ce72cc5fb14a707a782b205a642bb942fe7e0ef7f448f3efba9be7cd"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.155492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" event={"ID":"fb453a19-efba-405c-b3c3-b73892c2c4ac","Type":"ContainerStarted","Data":"386dcd435a76c983635e1bd48f188f7059e93b1c3ef8b252c547b017b08d8d67"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.155574 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" event={"ID":"fb453a19-efba-405c-b3c3-b73892c2c4ac","Type":"ContainerStarted","Data":"2ec96cfb298e5e851a92b08169214b03f0242bcdb7288750ecc4854b364f492c"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.157698 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d69db2260887d44b63770bfa8a5633ba1a4cd53cf4ffc8320d78030d670f95ac"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.157742 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d2db668a952d9aaa83cef7ce87235325efd6390a22e4806d0c3f44d1b157b075"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.158208 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.160674 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-624kd" event={"ID":"a0a0f5f3-30b3-45bf-9545-877b619b94f5","Type":"ContainerStarted","Data":"7b4705fb4de74b809d74353ad3591c10c3fd766711b45e26ab214a5f5f3a0e58"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.160694 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-624kd" event={"ID":"a0a0f5f3-30b3-45bf-9545-877b619b94f5","Type":"ContainerStarted","Data":"429196beb91a1cf6e0a9911084cb81825d0e813097c3b56cbd1be1e5c4c50913"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.161042 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-624kd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.172408 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.187070 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" event={"ID":"c2c29741-810a-48bb-a987-127fe5d45625","Type":"ContainerStarted","Data":"7db6e0990efda91782a72a59768af3c755e042f09b4929370567e379885274ca"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.187858 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.194566 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4"]
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.194683 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dv4p4"]
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.200026 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" event={"ID":"4ea68feb-ae95-41ab-9d2f-43f111c3721c","Type":"ContainerStarted","Data":"08843806f11e0be6b3d61d3329c7c1e6c44553f42fe6bea5bbfa5ba377b05046"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.201114 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.210012 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.211925 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.219363 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" event={"ID":"15598969-d30d-4c3d-a3ca-e9a75c54fb90","Type":"ContainerStarted","Data":"3957101bb3d48cea0268683ae9961208cff67137fe5eae1c9801174624948145"}
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.224851 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.226132 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.726114094 +0000 UTC m=+116.011339362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.227447 4869 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qdm27 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.237635 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.240880 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-2drd7" podStartSLOduration=79.240847634 podStartE2EDuration="1m19.240847634s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.21690709 +0000 UTC m=+115.502132368" watchObservedRunningTime="2026-03-12 14:49:23.240847634 +0000 UTC m=+115.526072902"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.308652 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l4g7m" podStartSLOduration=79.308624125 podStartE2EDuration="1m19.308624125s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.307872493 +0000 UTC m=+115.593097771" watchObservedRunningTime="2026-03-12 14:49:23.308624125 +0000 UTC m=+115.593849403"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.330020 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.331940 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.8319155 +0000 UTC m=+116.117140778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.344734 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9krm6" podStartSLOduration=79.344707231 podStartE2EDuration="1m19.344707231s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.340768774 +0000 UTC m=+115.625994072" watchObservedRunningTime="2026-03-12 14:49:23.344707231 +0000 UTC m=+115.629932509"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.419985 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" podStartSLOduration=79.419957256 podStartE2EDuration="1m19.419957256s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.419380308 +0000 UTC m=+115.704605586" watchObservedRunningTime="2026-03-12 14:49:23.419957256 +0000 UTC m=+115.705182534"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.425882 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-z4kzx" podStartSLOduration=79.425858902 podStartE2EDuration="1m19.425858902s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.383068855 +0000 UTC m=+115.668294143" watchObservedRunningTime="2026-03-12 14:49:23.425858902 +0000 UTC m=+115.711084180"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.431684 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.432372 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:23.932359035 +0000 UTC m=+116.217584313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.536611 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.536910 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.036895493 +0000 UTC m=+116.322120771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.544399 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-h2p9q" podStartSLOduration=79.544377536 podStartE2EDuration="1m19.544377536s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.487883521 +0000 UTC m=+115.773108799" watchObservedRunningTime="2026-03-12 14:49:23.544377536 +0000 UTC m=+115.829602804"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.544677 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-624kd" podStartSLOduration=9.544672215 podStartE2EDuration="9.544672215s" podCreationTimestamp="2026-03-12 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.537238523 +0000 UTC m=+115.822463801" watchObservedRunningTime="2026-03-12 14:49:23.544672215 +0000 UTC m=+115.829897493"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.560424 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nsnw9" podStartSLOduration=79.560408994 podStartE2EDuration="1m19.560408994s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.558942481 +0000 UTC m=+115.844167759" watchObservedRunningTime="2026-03-12 14:49:23.560408994 +0000 UTC m=+115.845634262"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.638372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.638855 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.138840034 +0000 UTC m=+116.424065312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.739967 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.740311 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.240245508 +0000 UTC m=+116.525470796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.740366 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.740694 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.240685131 +0000 UTC m=+116.525910409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.833416 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-jz4tc_ca5302b5-90b9-412e-b378-a1fdedf81184/controller-manager/0.log"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.833530 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.840063 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z5d9m" podStartSLOduration=79.840043784 podStartE2EDuration="1m19.840043784s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:23.574937188 +0000 UTC m=+115.860162466" watchObservedRunningTime="2026-03-12 14:49:23.840043784 +0000 UTC m=+116.125269062"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.841472 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.841757 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.341743345 +0000 UTC m=+116.626968623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.841785 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"]
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.888978 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 14:49:23 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Mar 12 14:49:23 crc kubenswrapper[4869]: [+]process-running ok
Mar 12 14:49:23 crc kubenswrapper[4869]: healthz check failed
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.889051 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.937881 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.944060 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles\") pod \"ca5302b5-90b9-412e-b378-a1fdedf81184\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.944316 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5br8w\" (UniqueName: \"kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w\") pod \"ca5302b5-90b9-412e-b378-a1fdedf81184\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.944345 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert\") pod \"ca5302b5-90b9-412e-b378-a1fdedf81184\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.944395 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca\") pod \"ca5302b5-90b9-412e-b378-a1fdedf81184\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.944450 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config\") pod \"ca5302b5-90b9-412e-b378-a1fdedf81184\" (UID: \"ca5302b5-90b9-412e-b378-a1fdedf81184\") " Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.944674 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:23 crc kubenswrapper[4869]: E0312 14:49:23.945145 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.445127998 +0000 UTC m=+116.730353277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.946003 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ca5302b5-90b9-412e-b378-a1fdedf81184" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.946448 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca" (OuterVolumeSpecName: "client-ca") pod "ca5302b5-90b9-412e-b378-a1fdedf81184" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.946947 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config" (OuterVolumeSpecName: "config") pod "ca5302b5-90b9-412e-b378-a1fdedf81184" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.966676 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ca5302b5-90b9-412e-b378-a1fdedf81184" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:49:23 crc kubenswrapper[4869]: I0312 14:49:23.970836 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w" (OuterVolumeSpecName: "kube-api-access-5br8w") pod "ca5302b5-90b9-412e-b378-a1fdedf81184" (UID: "ca5302b5-90b9-412e-b378-a1fdedf81184"). InnerVolumeSpecName "kube-api-access-5br8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.045817 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.046295 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.546263595 +0000 UTC m=+116.831488893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.046800 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.046896 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5br8w\" (UniqueName: \"kubernetes.io/projected/ca5302b5-90b9-412e-b378-a1fdedf81184-kube-api-access-5br8w\") 
on node \"crc\" DevicePath \"\"" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.046913 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca5302b5-90b9-412e-b378-a1fdedf81184-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.046927 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.046939 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.046954 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ca5302b5-90b9-412e-b378-a1fdedf81184-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.047244 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.547228834 +0000 UTC m=+116.832454112 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.148156 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.148352 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.648327389 +0000 UTC m=+116.933552667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.148409 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.148744 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.648730571 +0000 UTC m=+116.933955839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.221716 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" event={"ID":"11f8193e-ba56-4421-9458-e7f1c214db2b","Type":"ContainerStarted","Data":"eb9f28434bbe3754b5c023891f0f289fd2a812cf9a42175e834c11dbdcc82149"} Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.224154 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hllm5" event={"ID":"8415254a-55e8-451e-8be1-364b98f44196","Type":"ContainerStarted","Data":"e4ea2951631e63886481103530374492042bb49b1cc1c48c0e8d2c5adba91ca1"} Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.226414 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-jz4tc_ca5302b5-90b9-412e-b378-a1fdedf81184/controller-manager/0.log" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.226486 4869 generic.go:334] "Generic (PLEG): container finished" podID="ca5302b5-90b9-412e-b378-a1fdedf81184" containerID="fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691" exitCode=2 Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.226559 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.226565 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" event={"ID":"ca5302b5-90b9-412e-b378-a1fdedf81184","Type":"ContainerDied","Data":"fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691"} Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.226600 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-jz4tc" event={"ID":"ca5302b5-90b9-412e-b378-a1fdedf81184","Type":"ContainerDied","Data":"fda7d60aa80b88d10eda0ef5937de273b0428829438fa9ec20ad9b4f873b8a40"} Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.226619 4869 scope.go:117] "RemoveContainer" containerID="fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.228915 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" event={"ID":"993b8ae9-69a2-4cb2-806a-888528215561","Type":"ContainerStarted","Data":"193a165b5d4dfaeb773d13899bcb942283141b852b56b5b252755590bc7638ae"} Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.229088 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" gracePeriod=30 Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.229863 4869 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qdm27 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection 
refused" start-of-body= Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.229918 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.251247 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.251518 4869 scope.go:117] "RemoveContainer" containerID="fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.251664 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.75164185 +0000 UTC m=+117.036867138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.255645 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691\": container with ID starting with fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691 not found: ID does not exist" containerID="fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.255687 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691"} err="failed to get container status \"fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691\": rpc error: code = NotFound desc = could not find container \"fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691\": container with ID starting with fe34a0f8432106a01307de857b58191dc9d0afffdb0f9b9f509d35defb734691 not found: ID does not exist" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.263062 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hllm5" podStartSLOduration=80.26305215 podStartE2EDuration="1m20.26305215s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:24.259929077 +0000 UTC m=+116.545154355" 
watchObservedRunningTime="2026-03-12 14:49:24.26305215 +0000 UTC m=+116.548277428" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.323934 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" podStartSLOduration=80.323907775 podStartE2EDuration="1m20.323907775s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:24.299973782 +0000 UTC m=+116.585199070" watchObservedRunningTime="2026-03-12 14:49:24.323907775 +0000 UTC m=+116.609133043" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.330014 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jz4tc"] Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.334843 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-jz4tc"] Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.338497 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.357504 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.360985 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-12 14:49:24.86096079 +0000 UTC m=+117.146186068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.381770 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca821a8-9fb1-4ba9-8955-0969735ee00a" path="/var/lib/kubelet/pods/3ca821a8-9fb1-4ba9-8955-0969735ee00a/volumes" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.383572 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5302b5-90b9-412e-b378-a1fdedf81184" path="/var/lib/kubelet/pods/ca5302b5-90b9-412e-b378-a1fdedf81184/volumes" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.460395 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.461230 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:24.96120593 +0000 UTC m=+117.246431198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.562508 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.562910 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.062894303 +0000 UTC m=+117.348119581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.663661 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.664052 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.164005549 +0000 UTC m=+117.449230827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.664287 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.664715 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.164696319 +0000 UTC m=+117.449921597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.765969 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.766166 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.266133235 +0000 UTC m=+117.551358513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.766284 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.766719 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.266703042 +0000 UTC m=+117.551928320 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.859258 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65d49948fd-qwd2g"] Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.859491 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5302b5-90b9-412e-b378-a1fdedf81184" containerName="controller-manager" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.859506 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5302b5-90b9-412e-b378-a1fdedf81184" containerName="controller-manager" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.859645 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5302b5-90b9-412e-b378-a1fdedf81184" containerName="controller-manager" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.860057 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.865361 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.865394 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.866852 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.867126 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.367111686 +0000 UTC m=+117.652336964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.867841 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.867860 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.869671 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.881032 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:24 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:24 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:24 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.881093 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.884346 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.901380 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.905247 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65d49948fd-qwd2g"] Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.910154 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ngq4q"] Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.910988 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.913861 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.941309 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngq4q"] Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.968435 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-config\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.968481 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a5bb8-50a4-4380-b8ad-27a1097119de-serving-cert\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " 
pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.968523 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-proxy-ca-bundles\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.968548 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-client-ca\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.968618 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgt4v\" (UniqueName: \"kubernetes.io/projected/549a5bb8-50a4-4380-b8ad-27a1097119de-kube-api-access-xgt4v\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:24 crc kubenswrapper[4869]: I0312 14:49:24.968667 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:24 crc kubenswrapper[4869]: E0312 14:49:24.968961 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.468949724 +0000 UTC m=+117.754175002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.070414 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071065 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8m68\" (UniqueName: \"kubernetes.io/projected/8bf6f93d-a6f8-494b-abfa-47f8a164b667-kube-api-access-r8m68\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071102 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-utilities\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071140 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-config\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071179 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a5bb8-50a4-4380-b8ad-27a1097119de-serving-cert\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071202 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-proxy-ca-bundles\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071223 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-client-ca\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071249 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-catalog-content\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " 
pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.071301 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgt4v\" (UniqueName: \"kubernetes.io/projected/549a5bb8-50a4-4380-b8ad-27a1097119de-kube-api-access-xgt4v\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: E0312 14:49:25.071773 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.571745399 +0000 UTC m=+117.856970677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.073109 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-config\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.074092 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lxjq9"] Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.074616 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-client-ca\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.074828 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-proxy-ca-bundles\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.075273 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.081102 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a5bb8-50a4-4380-b8ad-27a1097119de-serving-cert\") pod \"controller-manager-65d49948fd-qwd2g\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.091601 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.104472 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxjq9"] Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.111244 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgt4v\" (UniqueName: \"kubernetes.io/projected/549a5bb8-50a4-4380-b8ad-27a1097119de-kube-api-access-xgt4v\") pod \"controller-manager-65d49948fd-qwd2g\" 
(UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172175 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-utilities\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172246 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-catalog-content\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172314 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8m68\" (UniqueName: \"kubernetes.io/projected/8bf6f93d-a6f8-494b-abfa-47f8a164b667-kube-api-access-r8m68\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172347 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pltp\" 
(UniqueName: \"kubernetes.io/projected/9b8fa922-8e49-42e1-a4a5-40069c505bf6-kube-api-access-6pltp\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172370 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-utilities\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172434 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-catalog-content\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: E0312 14:49:25.172611 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.672595297 +0000 UTC m=+117.957820565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.172931 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-catalog-content\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.173239 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-utilities\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.176600 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.207246 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8m68\" (UniqueName: \"kubernetes.io/projected/8bf6f93d-a6f8-494b-abfa-47f8a164b667-kube-api-access-r8m68\") pod \"certified-operators-ngq4q\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.239834 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.263265 4869 generic.go:334] "Generic (PLEG): container finished" podID="22b1e4b6-8e9c-4e12-8627-469e056beee5" containerID="f0b3a4fd58f3b8d4d5103d22eee5bb2e2cb19c1f2f54ddf41c09d93e9caf40d7" exitCode=0 Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.263352 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" event={"ID":"22b1e4b6-8e9c-4e12-8627-469e056beee5","Type":"ContainerDied","Data":"f0b3a4fd58f3b8d4d5103d22eee5bb2e2cb19c1f2f54ddf41c09d93e9caf40d7"} Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.275490 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 14:49:25 crc kubenswrapper[4869]: E0312 14:49:25.275658 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.775633179 +0000 UTC m=+118.060858457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.275836 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-utilities\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.275879 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.275903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-catalog-content\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.275939 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pltp\" (UniqueName: \"kubernetes.io/projected/9b8fa922-8e49-42e1-a4a5-40069c505bf6-kube-api-access-6pltp\") pod \"community-operators-lxjq9\" (UID: 
\"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.276657 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-catalog-content\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: E0312 14:49:25.276829 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.776801804 +0000 UTC m=+118.062027262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5lmx8" (UID: "32af1403-874a-49e0-ab8f-96511da15218") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.276922 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-utilities\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.277947 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" 
event={"ID":"11f8193e-ba56-4421-9458-e7f1c214db2b","Type":"ContainerStarted","Data":"42f3ddd2823d34efa443e55f36b09c083d910108510c4167dfbea2942734b38d"} Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.278644 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.278943 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-66jjg"] Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.282255 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" event={"ID":"93b6b5e8-18cd-419e-9a09-f8e6c29febd2","Type":"ContainerStarted","Data":"d7d898d90faf95f0d72234055341514d15a4f54651b8ab470694a0ab837f9b96"} Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.282300 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" event={"ID":"93b6b5e8-18cd-419e-9a09-f8e6c29febd2","Type":"ContainerStarted","Data":"27f108b13735da956697cc4e0dca409174dc6ae76c4cc523dd6f9f1f1e675d39"} Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.282395 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.289556 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.296286 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6"}
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.296470 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pltp\" (UniqueName: \"kubernetes.io/projected/9b8fa922-8e49-42e1-a4a5-40069c505bf6-kube-api-access-6pltp\") pod \"community-operators-lxjq9\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " pod="openshift-marketplace/community-operators-lxjq9"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.297240 4869 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.297856 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.302238 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66jjg"]
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.329329 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" podStartSLOduration=4.329281939 podStartE2EDuration="4.329281939s" podCreationTimestamp="2026-03-12 14:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:25.31723846 +0000 UTC m=+117.602463758" watchObservedRunningTime="2026-03-12 14:49:25.329281939 +0000 UTC m=+117.614507217"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.340947 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.409529 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.410349 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjbj\" (UniqueName: \"kubernetes.io/projected/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-kube-api-access-prjbj\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.410470 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-utilities\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.414353 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-catalog-content\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: E0312 14:49:25.415229 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 14:49:25.915213182 +0000 UTC m=+118.200438460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.430026 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxjq9"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.487502 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.489173 4869 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-12T14:49:25.297253724Z","Handler":null,"Name":""}
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.507858 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.507838234 podStartE2EDuration="25.507838234s" podCreationTimestamp="2026-03-12 14:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:25.445435273 +0000 UTC m=+117.730660541" watchObservedRunningTime="2026-03-12 14:49:25.507838234 +0000 UTC m=+117.793063522"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.509768 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.514992 4869 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.515070 4869 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.515857 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-catalog-content\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.515971 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjbj\" (UniqueName: \"kubernetes.io/projected/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-kube-api-access-prjbj\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.516013 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.516055 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-utilities\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.516609 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-utilities\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.516766 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-catalog-content\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.523302 4869 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.523353 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.553981 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjbj\" (UniqueName: \"kubernetes.io/projected/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-kube-api-access-prjbj\") pod \"certified-operators-66jjg\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.564863 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6x5kk"]
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.566249 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.568912 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6x5kk"]
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.607658 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-66jjg"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.621042 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-catalog-content\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.621114 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-utilities\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.621171 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzvvw\" (UniqueName: \"kubernetes.io/projected/82120625-a0ba-4136-b3d5-23b2b78f72cb-kube-api-access-mzvvw\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.623656 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.623636088 podStartE2EDuration="623.636088ms" podCreationTimestamp="2026-03-12 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:25.622966688 +0000 UTC m=+117.908191976" watchObservedRunningTime="2026-03-12 14:49:25.623636088 +0000 UTC m=+117.908861366"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.659037 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5lmx8\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") " pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.708711 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.708693935 podStartE2EDuration="708.693935ms" podCreationTimestamp="2026-03-12 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:25.654997223 +0000 UTC m=+117.940222501" watchObservedRunningTime="2026-03-12 14:49:25.708693935 +0000 UTC m=+117.993919213"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.708876 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65d49948fd-qwd2g"]
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.721764 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.722116 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-catalog-content\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.722174 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-utilities\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.722219 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzvvw\" (UniqueName: \"kubernetes.io/projected/82120625-a0ba-4136-b3d5-23b2b78f72cb-kube-api-access-mzvvw\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.722706 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-catalog-content\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.722833 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-utilities\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.728933 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.748881 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzvvw\" (UniqueName: \"kubernetes.io/projected/82120625-a0ba-4136-b3d5-23b2b78f72cb-kube-api-access-mzvvw\") pod \"community-operators-6x5kk\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.766012 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngq4q"]
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.883638 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 14:49:25 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Mar 12 14:49:25 crc kubenswrapper[4869]: [+]process-running ok
Mar 12 14:49:25 crc kubenswrapper[4869]: healthz check failed
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.883729 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.888370 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6x5kk"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.960350 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:49:25 crc kubenswrapper[4869]: I0312 14:49:25.962260 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-66jjg"]
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.018994 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxjq9"]
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.191768 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6x5kk"]
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.287025 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5lmx8"]
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.308053 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" event={"ID":"93b6b5e8-18cd-419e-9a09-f8e6c29febd2","Type":"ContainerStarted","Data":"39b5bf792047f494771b22e2d5ac472d19a0073a7eb0c4316d1155368ef9a98a"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.318568 4869 generic.go:334] "Generic (PLEG): container finished" podID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerID="dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d" exitCode=0
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.318653 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxjq9" event={"ID":"9b8fa922-8e49-42e1-a4a5-40069c505bf6","Type":"ContainerDied","Data":"dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.318679 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxjq9" event={"ID":"9b8fa922-8e49-42e1-a4a5-40069c505bf6","Type":"ContainerStarted","Data":"f8859531dcd7eb6bb66b7b9b84212ac829828f6def777b5ae413cf390bcac1f9"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.324607 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.325495 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" event={"ID":"549a5bb8-50a4-4380-b8ad-27a1097119de","Type":"ContainerStarted","Data":"05792b58fa366be55ca59abc6dc09665938d8bab87a799ed2b5e3d67a88edc0f"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.325577 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" event={"ID":"549a5bb8-50a4-4380-b8ad-27a1097119de","Type":"ContainerStarted","Data":"0173fb3045a51fbdc9485100250cb375296f436f8d1336a71c2d0b66eba0fda2"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.326758 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.328590 4869 patch_prober.go:28] interesting pod/controller-manager-65d49948fd-qwd2g container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body=
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.328664 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" podUID="549a5bb8-50a4-4380-b8ad-27a1097119de" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.350154 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-njkp6" podStartSLOduration=12.350131275 podStartE2EDuration="12.350131275s" podCreationTimestamp="2026-03-12 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:26.34725258 +0000 UTC m=+118.632477878" watchObservedRunningTime="2026-03-12 14:49:26.350131275 +0000 UTC m=+118.635356553"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.361732 4869 generic.go:334] "Generic (PLEG): container finished" podID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerID="79d87369560d867a7b26c8308a3d5f725140037cd32cfc4b4e787911e3bba063" exitCode=0
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.383661 4869 generic.go:334] "Generic (PLEG): container finished" podID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerID="7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44" exitCode=0
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.384053 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" podStartSLOduration=5.384038017 podStartE2EDuration="5.384038017s" podCreationTimestamp="2026-03-12 14:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:26.372235465 +0000 UTC m=+118.657460743" watchObservedRunningTime="2026-03-12 14:49:26.384038017 +0000 UTC m=+118.669263295"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.385496 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.394696 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.395755 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x5kk" event={"ID":"82120625-a0ba-4136-b3d5-23b2b78f72cb","Type":"ContainerStarted","Data":"26bda690ce8f68649aef318679e3249a54a428a276826823421e45f3ce3bceda"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.395805 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngq4q" event={"ID":"8bf6f93d-a6f8-494b-abfa-47f8a164b667","Type":"ContainerDied","Data":"79d87369560d867a7b26c8308a3d5f725140037cd32cfc4b4e787911e3bba063"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.395827 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngq4q" event={"ID":"8bf6f93d-a6f8-494b-abfa-47f8a164b667","Type":"ContainerStarted","Data":"4ac566714f09e1835b523976a7ef6a7c380014ce3aede52bcacc6dfa92c6aea5"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.395840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66jjg" event={"ID":"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2","Type":"ContainerDied","Data":"7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.395855 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66jjg" event={"ID":"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2","Type":"ContainerStarted","Data":"deac537a60ba7a54097b7177f8f9c17d750cc2efa13edb1c4ccc6f16acc7a29c"}
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.395957 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.399213 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.399597 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.421278 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.448423 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb85bfc1-88cb-4149-969f-72b0e93106d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.449171 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb85bfc1-88cb-4149-969f-72b0e93106d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.550261 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb85bfc1-88cb-4149-969f-72b0e93106d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.550321 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb85bfc1-88cb-4149-969f-72b0e93106d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.550713 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb85bfc1-88cb-4149-969f-72b0e93106d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.593274 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb85bfc1-88cb-4149-969f-72b0e93106d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.722582 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.745055 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.754980 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b1e4b6-8e9c-4e12-8627-469e056beee5-config-volume\") pod \"22b1e4b6-8e9c-4e12-8627-469e056beee5\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") "
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.755056 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b1e4b6-8e9c-4e12-8627-469e056beee5-secret-volume\") pod \"22b1e4b6-8e9c-4e12-8627-469e056beee5\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") "
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.755119 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c84fv\" (UniqueName: \"kubernetes.io/projected/22b1e4b6-8e9c-4e12-8627-469e056beee5-kube-api-access-c84fv\") pod \"22b1e4b6-8e9c-4e12-8627-469e056beee5\" (UID: \"22b1e4b6-8e9c-4e12-8627-469e056beee5\") "
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.756990 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b1e4b6-8e9c-4e12-8627-469e056beee5-config-volume" (OuterVolumeSpecName: "config-volume") pod "22b1e4b6-8e9c-4e12-8627-469e056beee5" (UID: "22b1e4b6-8e9c-4e12-8627-469e056beee5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.774175 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b1e4b6-8e9c-4e12-8627-469e056beee5-kube-api-access-c84fv" (OuterVolumeSpecName: "kube-api-access-c84fv") pod "22b1e4b6-8e9c-4e12-8627-469e056beee5" (UID: "22b1e4b6-8e9c-4e12-8627-469e056beee5"). InnerVolumeSpecName "kube-api-access-c84fv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.787753 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b1e4b6-8e9c-4e12-8627-469e056beee5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22b1e4b6-8e9c-4e12-8627-469e056beee5" (UID: "22b1e4b6-8e9c-4e12-8627-469e056beee5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.858977 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c84fv\" (UniqueName: \"kubernetes.io/projected/22b1e4b6-8e9c-4e12-8627-469e056beee5-kube-api-access-c84fv\") on node \"crc\" DevicePath \"\""
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.859026 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22b1e4b6-8e9c-4e12-8627-469e056beee5-config-volume\") on node \"crc\" DevicePath \"\""
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.859038 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22b1e4b6-8e9c-4e12-8627-469e056beee5-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.876099 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jvphv"]
Mar 12 14:49:26 crc kubenswrapper[4869]: E0312 14:49:26.876367 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b1e4b6-8e9c-4e12-8627-469e056beee5" containerName="collect-profiles"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.876389 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b1e4b6-8e9c-4e12-8627-469e056beee5" containerName="collect-profiles"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.876506 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b1e4b6-8e9c-4e12-8627-469e056beee5" containerName="collect-profiles"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.877369 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.882940 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.890249 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvphv"]
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.891729 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 14:49:26 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld
Mar 12 14:49:26 crc kubenswrapper[4869]: [+]process-running ok
Mar 12 14:49:26 crc kubenswrapper[4869]: healthz check failed
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.891772 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.959913 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-catalog-content\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.959953 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-utilities\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:26 crc kubenswrapper[4869]: I0312 14:49:26.959977 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f52p\" (UniqueName: \"kubernetes.io/projected/e65737a9-e615-4a51-a72c-e4b561bdd1b0-kube-api-access-2f52p\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.061599 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-catalog-content\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.061638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-utilities\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.061655 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f52p\" (UniqueName: \"kubernetes.io/projected/e65737a9-e615-4a51-a72c-e4b561bdd1b0-kube-api-access-2f52p\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.062497 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-catalog-content\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.062733 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-utilities\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.084838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f52p\" (UniqueName: \"kubernetes.io/projected/e65737a9-e615-4a51-a72c-e4b561bdd1b0-kube-api-access-2f52p\") pod \"redhat-marketplace-jvphv\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.136968 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.205266 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvphv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.288161 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sc6nv"]
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.289159 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sc6nv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.315288 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sc6nv"]
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.369453 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsc89\" (UniqueName: \"kubernetes.io/projected/078c5068-e532-4d7f-badc-c7f0c23a3191-kube-api-access-hsc89\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.369533 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-catalog-content\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.369584 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-utilities\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv"
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.413213 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fb85bfc1-88cb-4149-969f-72b0e93106d1","Type":"ContainerStarted","Data":"869bf4125e66d7168474f213fa86a2e17548f8b35a135cc1d2711c54d57f4b71"}
Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.420073 4869 generic.go:334] "Generic (PLEG): container finished" podID="82120625-a0ba-4136-b3d5-23b2b78f72cb"
containerID="ae27ac59a1246817e0f19e4a6992dcd7b2e8521ae0c9fcbd802c66006aaf00eb" exitCode=0 Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.420175 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x5kk" event={"ID":"82120625-a0ba-4136-b3d5-23b2b78f72cb","Type":"ContainerDied","Data":"ae27ac59a1246817e0f19e4a6992dcd7b2e8521ae0c9fcbd802c66006aaf00eb"} Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.422759 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" event={"ID":"32af1403-874a-49e0-ab8f-96511da15218","Type":"ContainerStarted","Data":"cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9"} Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.422783 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" event={"ID":"32af1403-874a-49e0-ab8f-96511da15218","Type":"ContainerStarted","Data":"dd9defa550f53785b552896d978862b07e6b63b096517727044c3799268a47b7"} Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.422866 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.439368 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.440657 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n" event={"ID":"22b1e4b6-8e9c-4e12-8627-469e056beee5","Type":"ContainerDied","Data":"041438a689bfd7960bd2bda7c3eba3c49bbd77e961db16269b8c2c068e68487c"} Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.440707 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="041438a689bfd7960bd2bda7c3eba3c49bbd77e961db16269b8c2c068e68487c" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.462973 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.476704 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-utilities\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.476789 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsc89\" (UniqueName: \"kubernetes.io/projected/078c5068-e532-4d7f-badc-c7f0c23a3191-kube-api-access-hsc89\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.477007 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-catalog-content\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " 
pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.477730 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-catalog-content\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.479117 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-utilities\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.520551 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsc89\" (UniqueName: \"kubernetes.io/projected/078c5068-e532-4d7f-badc-c7f0c23a3191-kube-api-access-hsc89\") pod \"redhat-marketplace-sc6nv\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.532337 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" podStartSLOduration=83.532320484 podStartE2EDuration="1m23.532320484s" podCreationTimestamp="2026-03-12 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:27.525796039 +0000 UTC m=+119.811021317" watchObservedRunningTime="2026-03-12 14:49:27.532320484 +0000 UTC m=+119.817545752" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.569580 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 14:49:27 crc 
kubenswrapper[4869]: I0312 14:49:27.570213 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.578474 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.578770 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.578791 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.578828 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.578893 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.616987 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.649886 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvphv"] Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.679804 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.679888 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.680055 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.716905 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.716975 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:27 crc 
kubenswrapper[4869]: I0312 14:49:27.717184 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.718855 4869 patch_prober.go:28] interesting pod/console-f9d7485db-qfqjj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.718907 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qfqjj" podUID="e61f813a-db17-46a6-a380-9f13452ef07b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.807978 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xr6k container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.807995 4869 patch_prober.go:28] interesting pod/downloads-7954f5f757-6xr6k container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.808041 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6xr6k" podUID="7d89d513-f587-4072-9038-09aa4e0a6b0d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.808073 4869 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-6xr6k" podUID="7d89d513-f587-4072-9038-09aa4e0a6b0d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.810585 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.810740 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.831084 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.838850 4869 ???:1] "http: TLS handshake error from 192.168.126.11:45154: no serving certificate available for the kubelet" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.879698 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.883679 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:27 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:27 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:27 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.883734 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.894633 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:27 crc kubenswrapper[4869]: I0312 14:49:27.983201 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sc6nv"] Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.072688 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zs5nz"] Mar 12 14:49:28 crc kubenswrapper[4869]: E0312 14:49:28.073020 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:28 crc kubenswrapper[4869]: E0312 14:49:28.075428 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:28 crc kubenswrapper[4869]: E0312 14:49:28.080881 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:28 crc kubenswrapper[4869]: E0312 14:49:28.080975 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" containerName="kube-multus-additional-cni-plugins" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.080912 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.090268 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.091225 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nn9z\" (UniqueName: \"kubernetes.io/projected/ffa8f886-960d-4115-a0e0-3ca252d2af08-kube-api-access-5nn9z\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.091300 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-utilities\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.091324 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-catalog-content\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.095328 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs5nz"] Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.193277 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn9z\" (UniqueName: \"kubernetes.io/projected/ffa8f886-960d-4115-a0e0-3ca252d2af08-kube-api-access-5nn9z\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.193344 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-utilities\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.193365 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-catalog-content\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.193944 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-catalog-content\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.194244 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-utilities\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.215247 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5nn9z\" (UniqueName: \"kubernetes.io/projected/ffa8f886-960d-4115-a0e0-3ca252d2af08-kube-api-access-5nn9z\") pod \"redhat-operators-zs5nz\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.271712 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.423980 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.424937 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.502347 4869 generic.go:334] "Generic (PLEG): container finished" podID="fb85bfc1-88cb-4149-969f-72b0e93106d1" containerID="ec9d45a78e8b4936036b2ca4eb8cd9a2be3e9bfa5b520cf2d2d33ba0e364b35d" exitCode=0 Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.503149 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fb85bfc1-88cb-4149-969f-72b0e93106d1","Type":"ContainerDied","Data":"ec9d45a78e8b4936036b2ca4eb8cd9a2be3e9bfa5b520cf2d2d33ba0e364b35d"} Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.504211 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cfj7c"] Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.505421 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.514754 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.515805 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.529623 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-utilities\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.529671 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nw6v\" (UniqueName: \"kubernetes.io/projected/74481165-f298-4cdc-9af7-feef043fa182-kube-api-access-4nw6v\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.529712 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-catalog-content\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.534094 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.539267 4869 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-cfj7c"] Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.594242 4869 generic.go:334] "Generic (PLEG): container finished" podID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerID="54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45" exitCode=0 Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.594441 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvphv" event={"ID":"e65737a9-e615-4a51-a72c-e4b561bdd1b0","Type":"ContainerDied","Data":"54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45"} Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.594515 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvphv" event={"ID":"e65737a9-e615-4a51-a72c-e4b561bdd1b0","Type":"ContainerStarted","Data":"a3c523124e0530612b9ba899ed2d6d98b37b48d0e3afe05bd244b28f7954a0db"} Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.630621 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.633031 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-utilities\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.633071 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-catalog-content\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.633093 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nw6v\" (UniqueName: \"kubernetes.io/projected/74481165-f298-4cdc-9af7-feef043fa182-kube-api-access-4nw6v\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.648013 4869 generic.go:334] "Generic (PLEG): container finished" podID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerID="0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f" exitCode=0 Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.651697 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sc6nv" event={"ID":"078c5068-e532-4d7f-badc-c7f0c23a3191","Type":"ContainerDied","Data":"0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f"} Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.651737 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sc6nv" event={"ID":"078c5068-e532-4d7f-badc-c7f0c23a3191","Type":"ContainerStarted","Data":"c4893c76e55ac2d47bd46e145d8c2c351efa6c81b90001de554a8f80fd8a051d"} Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.660248 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-utilities\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.668813 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-catalog-content\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 
14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.699396 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2hvvd" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.727511 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nw6v\" (UniqueName: \"kubernetes.io/projected/74481165-f298-4cdc-9af7-feef043fa182-kube-api-access-4nw6v\") pod \"redhat-operators-cfj7c\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.836877 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.903031 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:28 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:28 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:28 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:28 crc kubenswrapper[4869]: I0312 14:49:28.903074 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 14:49:29.346102 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs5nz"] Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 14:49:29.546356 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cfj7c"] Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 
14:49:29.662923 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfj7c" event={"ID":"74481165-f298-4cdc-9af7-feef043fa182","Type":"ContainerStarted","Data":"0d5ecf96236ca596f2256b0b02e89b1dab6a0f4cd5450bf5d232fa2604763606"} Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 14:49:29.666327 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a848e70f-8986-4a95-8a1d-8f2adbd41fb6","Type":"ContainerStarted","Data":"3fadf6ec43bb56999fc67422e3b28c3477535f651e0adf02f41576cec60e7917"} Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 14:49:29.673892 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5nz" event={"ID":"ffa8f886-960d-4115-a0e0-3ca252d2af08","Type":"ContainerStarted","Data":"f9123e78ae5147462aead70ddb6d9442b1f27e1c26fcbd8ac381c67efcebb3e8"} Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 14:49:29.682145 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qmrc7" Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 14:49:29.884049 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:29 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:29 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:29 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:29 crc kubenswrapper[4869]: I0312 14:49:29.884490 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 
14:49:30.012587 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.095522 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb85bfc1-88cb-4149-969f-72b0e93106d1-kubelet-dir\") pod \"fb85bfc1-88cb-4149-969f-72b0e93106d1\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.095611 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb85bfc1-88cb-4149-969f-72b0e93106d1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fb85bfc1-88cb-4149-969f-72b0e93106d1" (UID: "fb85bfc1-88cb-4149-969f-72b0e93106d1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.095711 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb85bfc1-88cb-4149-969f-72b0e93106d1-kube-api-access\") pod \"fb85bfc1-88cb-4149-969f-72b0e93106d1\" (UID: \"fb85bfc1-88cb-4149-969f-72b0e93106d1\") " Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.096037 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb85bfc1-88cb-4149-969f-72b0e93106d1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.113287 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb85bfc1-88cb-4149-969f-72b0e93106d1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fb85bfc1-88cb-4149-969f-72b0e93106d1" (UID: "fb85bfc1-88cb-4149-969f-72b0e93106d1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.206939 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb85bfc1-88cb-4149-969f-72b0e93106d1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.690122 4869 generic.go:334] "Generic (PLEG): container finished" podID="a848e70f-8986-4a95-8a1d-8f2adbd41fb6" containerID="36c67e546ffa09ff63ac004b6eae5f06be83853e39ceb824ccbba9678e6f41d6" exitCode=0 Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.690336 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a848e70f-8986-4a95-8a1d-8f2adbd41fb6","Type":"ContainerDied","Data":"36c67e546ffa09ff63ac004b6eae5f06be83853e39ceb824ccbba9678e6f41d6"} Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.696875 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5nz" event={"ID":"ffa8f886-960d-4115-a0e0-3ca252d2af08","Type":"ContainerDied","Data":"2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405"} Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.696799 4869 generic.go:334] "Generic (PLEG): container finished" podID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerID="2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405" exitCode=0 Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.701575 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.701601 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fb85bfc1-88cb-4149-969f-72b0e93106d1","Type":"ContainerDied","Data":"869bf4125e66d7168474f213fa86a2e17548f8b35a135cc1d2711c54d57f4b71"} Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.701655 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="869bf4125e66d7168474f213fa86a2e17548f8b35a135cc1d2711c54d57f4b71" Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.714511 4869 generic.go:334] "Generic (PLEG): container finished" podID="74481165-f298-4cdc-9af7-feef043fa182" containerID="cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d" exitCode=0 Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.714671 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfj7c" event={"ID":"74481165-f298-4cdc-9af7-feef043fa182","Type":"ContainerDied","Data":"cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d"} Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.889135 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:30 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:30 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:30 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:30 crc kubenswrapper[4869]: I0312 14:49:30.889293 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 12 14:49:31 crc kubenswrapper[4869]: I0312 14:49:31.879770 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:31 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:31 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:31 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:31 crc kubenswrapper[4869]: I0312 14:49:31.880215 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.272667 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.364887 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kubelet-dir\") pod \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.365098 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kube-api-access\") pod \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\" (UID: \"a848e70f-8986-4a95-8a1d-8f2adbd41fb6\") " Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.369719 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "a848e70f-8986-4a95-8a1d-8f2adbd41fb6" (UID: "a848e70f-8986-4a95-8a1d-8f2adbd41fb6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.381822 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a848e70f-8986-4a95-8a1d-8f2adbd41fb6" (UID: "a848e70f-8986-4a95-8a1d-8f2adbd41fb6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.396273 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.396321 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a848e70f-8986-4a95-8a1d-8f2adbd41fb6-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.764423 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a848e70f-8986-4a95-8a1d-8f2adbd41fb6","Type":"ContainerDied","Data":"3fadf6ec43bb56999fc67422e3b28c3477535f651e0adf02f41576cec60e7917"} Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.764481 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fadf6ec43bb56999fc67422e3b28c3477535f651e0adf02f41576cec60e7917" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.764634 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.881641 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:32 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:32 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:32 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:32 crc kubenswrapper[4869]: I0312 14:49:32.881774 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:33 crc kubenswrapper[4869]: I0312 14:49:33.211224 4869 ???:1] "http: TLS handshake error from 192.168.126.11:45166: no serving certificate available for the kubelet" Mar 12 14:49:33 crc kubenswrapper[4869]: I0312 14:49:33.368215 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-624kd" Mar 12 14:49:33 crc kubenswrapper[4869]: I0312 14:49:33.881111 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:33 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:33 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:33 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:33 crc kubenswrapper[4869]: I0312 14:49:33.881200 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:34 crc kubenswrapper[4869]: I0312 14:49:34.880404 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:34 crc kubenswrapper[4869]: [-]has-synced failed: reason withheld Mar 12 14:49:34 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:34 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:34 crc kubenswrapper[4869]: I0312 14:49:34.880783 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:35 crc kubenswrapper[4869]: I0312 14:49:35.515551 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:49:35 crc kubenswrapper[4869]: I0312 14:49:35.880067 4869 patch_prober.go:28] interesting pod/router-default-5444994796-5f46k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 14:49:35 crc kubenswrapper[4869]: [+]has-synced ok Mar 12 14:49:35 crc kubenswrapper[4869]: [+]process-running ok Mar 12 14:49:35 crc kubenswrapper[4869]: healthz check failed Mar 12 14:49:35 crc kubenswrapper[4869]: I0312 14:49:35.880119 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5f46k" podUID="e279336b-ac69-4574-9910-11c1fe663252" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 14:49:36 crc kubenswrapper[4869]: I0312 14:49:36.881232 4869 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:36 crc kubenswrapper[4869]: I0312 14:49:36.884289 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5f46k" Mar 12 14:49:37 crc kubenswrapper[4869]: I0312 14:49:37.716826 4869 patch_prober.go:28] interesting pod/console-f9d7485db-qfqjj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 12 14:49:37 crc kubenswrapper[4869]: I0312 14:49:37.717161 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qfqjj" podUID="e61f813a-db17-46a6-a380-9f13452ef07b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 12 14:49:37 crc kubenswrapper[4869]: I0312 14:49:37.822897 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6xr6k" Mar 12 14:49:38 crc kubenswrapper[4869]: E0312 14:49:38.076191 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:38 crc kubenswrapper[4869]: E0312 14:49:38.078355 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:38 crc 
kubenswrapper[4869]: E0312 14:49:38.080213 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:38 crc kubenswrapper[4869]: E0312 14:49:38.080248 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" containerName="kube-multus-additional-cni-plugins" Mar 12 14:49:38 crc kubenswrapper[4869]: I0312 14:49:38.110357 4869 ???:1] "http: TLS handshake error from 192.168.126.11:38924: no serving certificate available for the kubelet" Mar 12 14:49:40 crc kubenswrapper[4869]: I0312 14:49:40.472870 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65d49948fd-qwd2g"] Mar 12 14:49:40 crc kubenswrapper[4869]: I0312 14:49:40.473506 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" podUID="549a5bb8-50a4-4380-b8ad-27a1097119de" containerName="controller-manager" containerID="cri-o://05792b58fa366be55ca59abc6dc09665938d8bab87a799ed2b5e3d67a88edc0f" gracePeriod=30 Mar 12 14:49:40 crc kubenswrapper[4869]: I0312 14:49:40.488045 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"] Mar 12 14:49:40 crc kubenswrapper[4869]: I0312 14:49:40.488310 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" 
podUID="11f8193e-ba56-4421-9458-e7f1c214db2b" containerName="route-controller-manager" containerID="cri-o://42f3ddd2823d34efa443e55f36b09c083d910108510c4167dfbea2942734b38d" gracePeriod=30 Mar 12 14:49:40 crc kubenswrapper[4869]: I0312 14:49:40.863322 4869 generic.go:334] "Generic (PLEG): container finished" podID="11f8193e-ba56-4421-9458-e7f1c214db2b" containerID="42f3ddd2823d34efa443e55f36b09c083d910108510c4167dfbea2942734b38d" exitCode=0 Mar 12 14:49:40 crc kubenswrapper[4869]: I0312 14:49:40.863389 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" event={"ID":"11f8193e-ba56-4421-9458-e7f1c214db2b","Type":"ContainerDied","Data":"42f3ddd2823d34efa443e55f36b09c083d910108510c4167dfbea2942734b38d"} Mar 12 14:49:41 crc kubenswrapper[4869]: I0312 14:49:41.871011 4869 generic.go:334] "Generic (PLEG): container finished" podID="549a5bb8-50a4-4380-b8ad-27a1097119de" containerID="05792b58fa366be55ca59abc6dc09665938d8bab87a799ed2b5e3d67a88edc0f" exitCode=0 Mar 12 14:49:41 crc kubenswrapper[4869]: I0312 14:49:41.871093 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" event={"ID":"549a5bb8-50a4-4380-b8ad-27a1097119de","Type":"ContainerDied","Data":"05792b58fa366be55ca59abc6dc09665938d8bab87a799ed2b5e3d67a88edc0f"} Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.191633 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.218902 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd"] Mar 12 14:49:43 crc kubenswrapper[4869]: E0312 14:49:43.219278 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549a5bb8-50a4-4380-b8ad-27a1097119de" containerName="controller-manager" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.219324 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="549a5bb8-50a4-4380-b8ad-27a1097119de" containerName="controller-manager" Mar 12 14:49:43 crc kubenswrapper[4869]: E0312 14:49:43.219355 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb85bfc1-88cb-4149-969f-72b0e93106d1" containerName="pruner" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.219391 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb85bfc1-88cb-4149-969f-72b0e93106d1" containerName="pruner" Mar 12 14:49:43 crc kubenswrapper[4869]: E0312 14:49:43.219412 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a848e70f-8986-4a95-8a1d-8f2adbd41fb6" containerName="pruner" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.219421 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a848e70f-8986-4a95-8a1d-8f2adbd41fb6" containerName="pruner" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.219607 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a848e70f-8986-4a95-8a1d-8f2adbd41fb6" containerName="pruner" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.219624 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb85bfc1-88cb-4149-969f-72b0e93106d1" containerName="pruner" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.219662 4869 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="549a5bb8-50a4-4380-b8ad-27a1097119de" containerName="controller-manager" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.220219 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.225535 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd"] Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.239029 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.280410 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11f8193e-ba56-4421-9458-e7f1c214db2b-serving-cert\") pod \"11f8193e-ba56-4421-9458-e7f1c214db2b\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.280482 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-config\") pod \"11f8193e-ba56-4421-9458-e7f1c214db2b\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.280501 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a5bb8-50a4-4380-b8ad-27a1097119de-serving-cert\") pod \"549a5bb8-50a4-4380-b8ad-27a1097119de\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.280518 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-client-ca\") pod 
\"549a5bb8-50a4-4380-b8ad-27a1097119de\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.280545 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgt4v\" (UniqueName: \"kubernetes.io/projected/549a5bb8-50a4-4380-b8ad-27a1097119de-kube-api-access-xgt4v\") pod \"549a5bb8-50a4-4380-b8ad-27a1097119de\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.280998 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-config\") pod \"549a5bb8-50a4-4380-b8ad-27a1097119de\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281050 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-proxy-ca-bundles\") pod \"549a5bb8-50a4-4380-b8ad-27a1097119de\" (UID: \"549a5bb8-50a4-4380-b8ad-27a1097119de\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281091 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqpgb\" (UniqueName: \"kubernetes.io/projected/11f8193e-ba56-4421-9458-e7f1c214db2b-kube-api-access-nqpgb\") pod \"11f8193e-ba56-4421-9458-e7f1c214db2b\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281113 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-client-ca\") pod \"11f8193e-ba56-4421-9458-e7f1c214db2b\" (UID: \"11f8193e-ba56-4421-9458-e7f1c214db2b\") " Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281356 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-serving-cert\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281380 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-proxy-ca-bundles\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281439 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-config\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281506 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-client-ca\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281534 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhv7x\" (UniqueName: \"kubernetes.io/projected/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-kube-api-access-jhv7x\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: 
\"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.281935 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-client-ca" (OuterVolumeSpecName: "client-ca") pod "549a5bb8-50a4-4380-b8ad-27a1097119de" (UID: "549a5bb8-50a4-4380-b8ad-27a1097119de"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.282057 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-config" (OuterVolumeSpecName: "config") pod "549a5bb8-50a4-4380-b8ad-27a1097119de" (UID: "549a5bb8-50a4-4380-b8ad-27a1097119de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.282097 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "549a5bb8-50a4-4380-b8ad-27a1097119de" (UID: "549a5bb8-50a4-4380-b8ad-27a1097119de"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.282134 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-client-ca" (OuterVolumeSpecName: "client-ca") pod "11f8193e-ba56-4421-9458-e7f1c214db2b" (UID: "11f8193e-ba56-4421-9458-e7f1c214db2b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.282224 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-config" (OuterVolumeSpecName: "config") pod "11f8193e-ba56-4421-9458-e7f1c214db2b" (UID: "11f8193e-ba56-4421-9458-e7f1c214db2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.286045 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f8193e-ba56-4421-9458-e7f1c214db2b-kube-api-access-nqpgb" (OuterVolumeSpecName: "kube-api-access-nqpgb") pod "11f8193e-ba56-4421-9458-e7f1c214db2b" (UID: "11f8193e-ba56-4421-9458-e7f1c214db2b"). InnerVolumeSpecName "kube-api-access-nqpgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.286060 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f8193e-ba56-4421-9458-e7f1c214db2b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11f8193e-ba56-4421-9458-e7f1c214db2b" (UID: "11f8193e-ba56-4421-9458-e7f1c214db2b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.286174 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549a5bb8-50a4-4380-b8ad-27a1097119de-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "549a5bb8-50a4-4380-b8ad-27a1097119de" (UID: "549a5bb8-50a4-4380-b8ad-27a1097119de"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.292148 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549a5bb8-50a4-4380-b8ad-27a1097119de-kube-api-access-xgt4v" (OuterVolumeSpecName: "kube-api-access-xgt4v") pod "549a5bb8-50a4-4380-b8ad-27a1097119de" (UID: "549a5bb8-50a4-4380-b8ad-27a1097119de"). InnerVolumeSpecName "kube-api-access-xgt4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.351593 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382300 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-client-ca\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382374 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhv7x\" (UniqueName: \"kubernetes.io/projected/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-kube-api-access-jhv7x\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382445 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-serving-cert\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc 
kubenswrapper[4869]: I0312 14:49:43.382467 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-proxy-ca-bundles\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382494 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-config\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382542 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11f8193e-ba56-4421-9458-e7f1c214db2b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382558 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a5bb8-50a4-4380-b8ad-27a1097119de-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382571 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382597 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382607 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgt4v\" 
(UniqueName: \"kubernetes.io/projected/549a5bb8-50a4-4380-b8ad-27a1097119de-kube-api-access-xgt4v\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382615 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382623 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/549a5bb8-50a4-4380-b8ad-27a1097119de-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382632 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqpgb\" (UniqueName: \"kubernetes.io/projected/11f8193e-ba56-4421-9458-e7f1c214db2b-kube-api-access-nqpgb\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.382640 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11f8193e-ba56-4421-9458-e7f1c214db2b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.383323 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-client-ca\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.385122 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-proxy-ca-bundles\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " 
pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.388165 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-serving-cert\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.395982 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-config\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.398145 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhv7x\" (UniqueName: \"kubernetes.io/projected/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-kube-api-access-jhv7x\") pod \"controller-manager-5b759d5cf5-b9lqd\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.556284 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.763606 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd"] Mar 12 14:49:43 crc kubenswrapper[4869]: W0312 14:49:43.775004 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb103b09_420a_4c89_9e40_f68fbc7b8d3b.slice/crio-a5813c8081b8cde77e3e99e317f5ddd76df5f6ca0750d37860642af38b8a3460 WatchSource:0}: Error finding container a5813c8081b8cde77e3e99e317f5ddd76df5f6ca0750d37860642af38b8a3460: Status 404 returned error can't find the container with id a5813c8081b8cde77e3e99e317f5ddd76df5f6ca0750d37860642af38b8a3460 Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.882788 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" event={"ID":"549a5bb8-50a4-4380-b8ad-27a1097119de","Type":"ContainerDied","Data":"0173fb3045a51fbdc9485100250cb375296f436f8d1336a71c2d0b66eba0fda2"} Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.882809 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65d49948fd-qwd2g" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.882874 4869 scope.go:117] "RemoveContainer" containerID="05792b58fa366be55ca59abc6dc09665938d8bab87a799ed2b5e3d67a88edc0f" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.883544 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" event={"ID":"fb103b09-420a-4c89-9e40-f68fbc7b8d3b","Type":"ContainerStarted","Data":"a5813c8081b8cde77e3e99e317f5ddd76df5f6ca0750d37860642af38b8a3460"} Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.885664 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.885815 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" event={"ID":"11f8193e-ba56-4421-9458-e7f1c214db2b","Type":"ContainerDied","Data":"eb9f28434bbe3754b5c023891f0f289fd2a812cf9a42175e834c11dbdcc82149"} Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.899852 4869 scope.go:117] "RemoveContainer" containerID="42f3ddd2823d34efa443e55f36b09c083d910108510c4167dfbea2942734b38d" Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.900274 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=0.900250367 podStartE2EDuration="900.250367ms" podCreationTimestamp="2026-03-12 14:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:43.897762484 +0000 UTC m=+136.182987762" watchObservedRunningTime="2026-03-12 14:49:43.900250367 +0000 UTC m=+136.185475645" Mar 12 14:49:43 crc kubenswrapper[4869]: 
I0312 14:49:43.928021 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"] Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.932367 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd"] Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.939779 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65d49948fd-qwd2g"] Mar 12 14:49:43 crc kubenswrapper[4869]: I0312 14:49:43.942796 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65d49948fd-qwd2g"] Mar 12 14:49:44 crc kubenswrapper[4869]: I0312 14:49:44.173020 4869 patch_prober.go:28] interesting pod/route-controller-manager-f79bfd8b-n2qrd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": context deadline exceeded" start-of-body= Mar 12 14:49:44 crc kubenswrapper[4869]: I0312 14:49:44.173100 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-f79bfd8b-n2qrd" podUID="11f8193e-ba56-4421-9458-e7f1c214db2b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": context deadline exceeded" Mar 12 14:49:44 crc kubenswrapper[4869]: I0312 14:49:44.342564 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f8193e-ba56-4421-9458-e7f1c214db2b" path="/var/lib/kubelet/pods/11f8193e-ba56-4421-9458-e7f1c214db2b/volumes" Mar 12 14:49:44 crc kubenswrapper[4869]: I0312 14:49:44.343402 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549a5bb8-50a4-4380-b8ad-27a1097119de" path="/var/lib/kubelet/pods/549a5bb8-50a4-4380-b8ad-27a1097119de/volumes" Mar 12 14:49:44 crc 
kubenswrapper[4869]: I0312 14:49:44.893877 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" event={"ID":"fb103b09-420a-4c89-9e40-f68fbc7b8d3b","Type":"ContainerStarted","Data":"af1af46808ddc8c90dfcf2c7d3addeeaf18a7d49b4fbfcecee71cd49d7ff25ad"} Mar 12 14:49:44 crc kubenswrapper[4869]: I0312 14:49:44.895043 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:44 crc kubenswrapper[4869]: I0312 14:49:44.901133 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:49:44 crc kubenswrapper[4869]: I0312 14:49:44.920424 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" podStartSLOduration=4.920400562 podStartE2EDuration="4.920400562s" podCreationTimestamp="2026-03-12 14:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:44.916188742 +0000 UTC m=+137.201414040" watchObservedRunningTime="2026-03-12 14:49:44.920400562 +0000 UTC m=+137.205625840" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.863104 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g"] Mar 12 14:49:45 crc kubenswrapper[4869]: E0312 14:49:45.863658 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f8193e-ba56-4421-9458-e7f1c214db2b" containerName="route-controller-manager" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.863670 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f8193e-ba56-4421-9458-e7f1c214db2b" containerName="route-controller-manager" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 
14:49:45.863783 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f8193e-ba56-4421-9458-e7f1c214db2b" containerName="route-controller-manager" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.864107 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.866543 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.867020 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.867266 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.867598 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.867866 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.875923 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.876151 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g"] Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.934017 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-client-ca\") 
pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.934122 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnm7\" (UniqueName: \"kubernetes.io/projected/0f845ac0-2137-425f-b2e7-e9326cb15eea-kube-api-access-sgnm7\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.934440 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f845ac0-2137-425f-b2e7-e9326cb15eea-serving-cert\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.934611 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-config\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:45 crc kubenswrapper[4869]: I0312 14:49:45.968271 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.036296 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnm7\" (UniqueName: 
\"kubernetes.io/projected/0f845ac0-2137-425f-b2e7-e9326cb15eea-kube-api-access-sgnm7\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.036425 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f845ac0-2137-425f-b2e7-e9326cb15eea-serving-cert\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.036469 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-config\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.036510 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-client-ca\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.039747 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-config\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc 
kubenswrapper[4869]: I0312 14:49:46.044706 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-client-ca\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.053204 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f845ac0-2137-425f-b2e7-e9326cb15eea-serving-cert\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.078325 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnm7\" (UniqueName: \"kubernetes.io/projected/0f845ac0-2137-425f-b2e7-e9326cb15eea-kube-api-access-sgnm7\") pod \"route-controller-manager-84995f8c99-znz8g\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:46 crc kubenswrapper[4869]: I0312 14:49:46.190186 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:47 crc kubenswrapper[4869]: I0312 14:49:47.762523 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:47 crc kubenswrapper[4869]: I0312 14:49:47.767860 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 14:49:48 crc kubenswrapper[4869]: E0312 14:49:48.059394 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:48 crc kubenswrapper[4869]: E0312 14:49:48.060747 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:48 crc kubenswrapper[4869]: E0312 14:49:48.062181 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 14:49:48 crc kubenswrapper[4869]: E0312 14:49:48.062223 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" 
containerName="kube-multus-additional-cni-plugins" Mar 12 14:49:54 crc kubenswrapper[4869]: I0312 14:49:54.976379 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7zmqf_46d3c5a6-c886-4ae0-b381-95ffb9902718/kube-multus-additional-cni-plugins/0.log" Mar 12 14:49:54 crc kubenswrapper[4869]: I0312 14:49:54.976904 4869 generic.go:334] "Generic (PLEG): container finished" podID="46d3c5a6-c886-4ae0-b381-95ffb9902718" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" exitCode=137 Mar 12 14:49:54 crc kubenswrapper[4869]: I0312 14:49:54.976938 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" event={"ID":"46d3c5a6-c886-4ae0-b381-95ffb9902718","Type":"ContainerDied","Data":"72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8"} Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.640076 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7zmqf_46d3c5a6-c886-4ae0-b381-95ffb9902718/kube-multus-additional-cni-plugins/0.log" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.640349 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:57 crc kubenswrapper[4869]: E0312 14:49:57.670501 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 12 14:49:57 crc kubenswrapper[4869]: E0312 14:49:57.670659 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzvvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6x5kk_openshift-marketplace(82120625-a0ba-4136-b3d5-23b2b78f72cb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 14:49:57 crc kubenswrapper[4869]: E0312 14:49:57.672534 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6x5kk" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.699034 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7fpl\" (UniqueName: \"kubernetes.io/projected/46d3c5a6-c886-4ae0-b381-95ffb9902718-kube-api-access-v7fpl\") pod \"46d3c5a6-c886-4ae0-b381-95ffb9902718\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.699090 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46d3c5a6-c886-4ae0-b381-95ffb9902718-tuning-conf-dir\") pod \"46d3c5a6-c886-4ae0-b381-95ffb9902718\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.699117 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46d3c5a6-c886-4ae0-b381-95ffb9902718-cni-sysctl-allowlist\") pod \"46d3c5a6-c886-4ae0-b381-95ffb9902718\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.699183 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/46d3c5a6-c886-4ae0-b381-95ffb9902718-ready\") pod \"46d3c5a6-c886-4ae0-b381-95ffb9902718\" (UID: \"46d3c5a6-c886-4ae0-b381-95ffb9902718\") " Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.699756 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46d3c5a6-c886-4ae0-b381-95ffb9902718-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "46d3c5a6-c886-4ae0-b381-95ffb9902718" (UID: "46d3c5a6-c886-4ae0-b381-95ffb9902718"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.699897 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d3c5a6-c886-4ae0-b381-95ffb9902718-ready" (OuterVolumeSpecName: "ready") pod "46d3c5a6-c886-4ae0-b381-95ffb9902718" (UID: "46d3c5a6-c886-4ae0-b381-95ffb9902718"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.700181 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d3c5a6-c886-4ae0-b381-95ffb9902718-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "46d3c5a6-c886-4ae0-b381-95ffb9902718" (UID: "46d3c5a6-c886-4ae0-b381-95ffb9902718"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.705360 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d3c5a6-c886-4ae0-b381-95ffb9902718-kube-api-access-v7fpl" (OuterVolumeSpecName: "kube-api-access-v7fpl") pod "46d3c5a6-c886-4ae0-b381-95ffb9902718" (UID: "46d3c5a6-c886-4ae0-b381-95ffb9902718"). InnerVolumeSpecName "kube-api-access-v7fpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.800278 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7fpl\" (UniqueName: \"kubernetes.io/projected/46d3c5a6-c886-4ae0-b381-95ffb9902718-kube-api-access-v7fpl\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.800606 4869 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46d3c5a6-c886-4ae0-b381-95ffb9902718-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.800619 4869 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46d3c5a6-c886-4ae0-b381-95ffb9902718-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.800628 4869 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46d3c5a6-c886-4ae0-b381-95ffb9902718-ready\") on node \"crc\" DevicePath \"\"" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.976416 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8jwlf" Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.995107 4869 generic.go:334] "Generic (PLEG): container finished" podID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerID="f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a" exitCode=0 Mar 12 14:49:57 crc kubenswrapper[4869]: I0312 14:49:57.995179 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvphv" event={"ID":"e65737a9-e615-4a51-a72c-e4b561bdd1b0","Type":"ContainerDied","Data":"f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a"} Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.003200 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfj7c" event={"ID":"74481165-f298-4cdc-9af7-feef043fa182","Type":"ContainerStarted","Data":"3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340"} Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.014354 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7zmqf_46d3c5a6-c886-4ae0-b381-95ffb9902718/kube-multus-additional-cni-plugins/0.log" Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.014738 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" event={"ID":"46d3c5a6-c886-4ae0-b381-95ffb9902718","Type":"ContainerDied","Data":"c08f009987d3d7a4091e63b8f8bf03c83adb6ead2095c0f6f8383bf42eef72f0"} Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.014867 4869 scope.go:117] "RemoveContainer" containerID="72472770ea7bd55eac4c9d20c39ac3ba553fd8709bff38bbcbb73a2fe241d2f8" Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.014749 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zmqf" Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.028046 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngq4q" event={"ID":"8bf6f93d-a6f8-494b-abfa-47f8a164b667","Type":"ContainerStarted","Data":"5f069dafa1443afdf8854b3f8a9943d1403024586b5b16823c3872cf9380cff1"} Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.035612 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66jjg" event={"ID":"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2","Type":"ContainerStarted","Data":"ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927"} Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.049394 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxjq9" event={"ID":"9b8fa922-8e49-42e1-a4a5-40069c505bf6","Type":"ContainerStarted","Data":"c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389"} Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.054974 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g"] Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.065462 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sc6nv" event={"ID":"078c5068-e532-4d7f-badc-c7f0c23a3191","Type":"ContainerStarted","Data":"a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83"} Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.073414 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5nz" event={"ID":"ffa8f886-960d-4115-a0e0-3ca252d2af08","Type":"ContainerStarted","Data":"83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be"} Mar 12 14:49:58 crc kubenswrapper[4869]: E0312 14:49:58.081217 4869 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6x5kk" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.108397 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zmqf"] Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.112771 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zmqf"] Mar 12 14:49:58 crc kubenswrapper[4869]: I0312 14:49:58.345181 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" path="/var/lib/kubelet/pods/46d3c5a6-c886-4ae0-b381-95ffb9902718/volumes" Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.081197 4869 generic.go:334] "Generic (PLEG): container finished" podID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerID="ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927" exitCode=0 Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.081283 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66jjg" event={"ID":"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2","Type":"ContainerDied","Data":"ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.083639 4869 generic.go:334] "Generic (PLEG): container finished" podID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerID="c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389" exitCode=0 Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.083884 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxjq9" 
event={"ID":"9b8fa922-8e49-42e1-a4a5-40069c505bf6","Type":"ContainerDied","Data":"c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.087617 4869 generic.go:334] "Generic (PLEG): container finished" podID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerID="a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83" exitCode=0 Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.087761 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sc6nv" event={"ID":"078c5068-e532-4d7f-badc-c7f0c23a3191","Type":"ContainerDied","Data":"a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.091445 4869 generic.go:334] "Generic (PLEG): container finished" podID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerID="83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be" exitCode=0 Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.091502 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5nz" event={"ID":"ffa8f886-960d-4115-a0e0-3ca252d2af08","Type":"ContainerDied","Data":"83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.095486 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" event={"ID":"0f845ac0-2137-425f-b2e7-e9326cb15eea","Type":"ContainerStarted","Data":"968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.095512 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" 
event={"ID":"0f845ac0-2137-425f-b2e7-e9326cb15eea","Type":"ContainerStarted","Data":"875e8c6ccfb7b1f52a2388f1f16581f2cb10fb2c365148b5999cffc702dff677"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.095947 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.105030 4869 generic.go:334] "Generic (PLEG): container finished" podID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerID="5f069dafa1443afdf8854b3f8a9943d1403024586b5b16823c3872cf9380cff1" exitCode=0 Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.105093 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngq4q" event={"ID":"8bf6f93d-a6f8-494b-abfa-47f8a164b667","Type":"ContainerDied","Data":"5f069dafa1443afdf8854b3f8a9943d1403024586b5b16823c3872cf9380cff1"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.111996 4869 generic.go:334] "Generic (PLEG): container finished" podID="74481165-f298-4cdc-9af7-feef043fa182" containerID="3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340" exitCode=0 Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.112131 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfj7c" event={"ID":"74481165-f298-4cdc-9af7-feef043fa182","Type":"ContainerDied","Data":"3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340"} Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.113689 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:49:59 crc kubenswrapper[4869]: I0312 14:49:59.177359 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" podStartSLOduration=19.177337767 
podStartE2EDuration="19.177337767s" podCreationTimestamp="2026-03-12 14:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:49:59.176365424 +0000 UTC m=+151.461590702" watchObservedRunningTime="2026-03-12 14:49:59.177337767 +0000 UTC m=+151.462563045" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.132286 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555450-qsz9r"] Mar 12 14:50:00 crc kubenswrapper[4869]: E0312 14:50:00.132884 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" containerName="kube-multus-additional-cni-plugins" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.132898 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" containerName="kube-multus-additional-cni-plugins" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.133040 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d3c5a6-c886-4ae0-b381-95ffb9902718" containerName="kube-multus-additional-cni-plugins" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.134531 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-qsz9r" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.137775 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.137870 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.137995 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.138836 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-qsz9r"] Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.241337 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8frrv\" (UniqueName: \"kubernetes.io/projected/c6a83972-34aa-4505-8b6f-b1345b7981cd-kube-api-access-8frrv\") pod \"auto-csr-approver-29555450-qsz9r\" (UID: \"c6a83972-34aa-4505-8b6f-b1345b7981cd\") " pod="openshift-infra/auto-csr-approver-29555450-qsz9r" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.342334 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8frrv\" (UniqueName: \"kubernetes.io/projected/c6a83972-34aa-4505-8b6f-b1345b7981cd-kube-api-access-8frrv\") pod \"auto-csr-approver-29555450-qsz9r\" (UID: \"c6a83972-34aa-4505-8b6f-b1345b7981cd\") " pod="openshift-infra/auto-csr-approver-29555450-qsz9r" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.362346 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8frrv\" (UniqueName: \"kubernetes.io/projected/c6a83972-34aa-4505-8b6f-b1345b7981cd-kube-api-access-8frrv\") pod \"auto-csr-approver-29555450-qsz9r\" (UID: \"c6a83972-34aa-4505-8b6f-b1345b7981cd\") " 
pod="openshift-infra/auto-csr-approver-29555450-qsz9r" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.444211 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd"] Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.444487 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" podUID="fb103b09-420a-4c89-9e40-f68fbc7b8d3b" containerName="controller-manager" containerID="cri-o://af1af46808ddc8c90dfcf2c7d3addeeaf18a7d49b4fbfcecee71cd49d7ff25ad" gracePeriod=30 Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.459174 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-qsz9r" Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.575946 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g"] Mar 12 14:50:00 crc kubenswrapper[4869]: I0312 14:50:00.822900 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fdwk9"] Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.009491 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-qsz9r"] Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.131095 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvphv" event={"ID":"e65737a9-e615-4a51-a72c-e4b561bdd1b0","Type":"ContainerStarted","Data":"d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439"} Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.132049 4869 generic.go:334] "Generic (PLEG): container finished" podID="fb103b09-420a-4c89-9e40-f68fbc7b8d3b" containerID="af1af46808ddc8c90dfcf2c7d3addeeaf18a7d49b4fbfcecee71cd49d7ff25ad" exitCode=0 Mar 12 14:50:01 crc 
kubenswrapper[4869]: I0312 14:50:01.132107 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" event={"ID":"fb103b09-420a-4c89-9e40-f68fbc7b8d3b","Type":"ContainerDied","Data":"af1af46808ddc8c90dfcf2c7d3addeeaf18a7d49b4fbfcecee71cd49d7ff25ad"} Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.133045 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555450-qsz9r" event={"ID":"c6a83972-34aa-4505-8b6f-b1345b7981cd","Type":"ContainerStarted","Data":"3475af765ae81cda0a49bac97f192cd9c18d7fb2ceee2066bbc34a99ac85bc8a"} Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.152085 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jvphv" podStartSLOduration=3.309752457 podStartE2EDuration="35.152063459s" podCreationTimestamp="2026-03-12 14:49:26 +0000 UTC" firstStartedPulling="2026-03-12 14:49:28.603216312 +0000 UTC m=+120.888441590" lastFinishedPulling="2026-03-12 14:50:00.445527324 +0000 UTC m=+152.730752592" observedRunningTime="2026-03-12 14:50:01.148835391 +0000 UTC m=+153.434060699" watchObservedRunningTime="2026-03-12 14:50:01.152063459 +0000 UTC m=+153.437288737" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.233079 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.358994 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-proxy-ca-bundles\") pod \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.359041 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-config\") pod \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.359182 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-client-ca\") pod \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.359331 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-serving-cert\") pod \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.359371 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhv7x\" (UniqueName: \"kubernetes.io/projected/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-kube-api-access-jhv7x\") pod \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\" (UID: \"fb103b09-420a-4c89-9e40-f68fbc7b8d3b\") " Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.360067 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb103b09-420a-4c89-9e40-f68fbc7b8d3b" (UID: "fb103b09-420a-4c89-9e40-f68fbc7b8d3b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.360090 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fb103b09-420a-4c89-9e40-f68fbc7b8d3b" (UID: "fb103b09-420a-4c89-9e40-f68fbc7b8d3b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.360117 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-config" (OuterVolumeSpecName: "config") pod "fb103b09-420a-4c89-9e40-f68fbc7b8d3b" (UID: "fb103b09-420a-4c89-9e40-f68fbc7b8d3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.366259 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-kube-api-access-jhv7x" (OuterVolumeSpecName: "kube-api-access-jhv7x") pod "fb103b09-420a-4c89-9e40-f68fbc7b8d3b" (UID: "fb103b09-420a-4c89-9e40-f68fbc7b8d3b"). InnerVolumeSpecName "kube-api-access-jhv7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.367860 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb103b09-420a-4c89-9e40-f68fbc7b8d3b" (UID: "fb103b09-420a-4c89-9e40-f68fbc7b8d3b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.412343 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.462906 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.462930 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.462940 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhv7x\" (UniqueName: \"kubernetes.io/projected/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-kube-api-access-jhv7x\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.462954 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.462966 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb103b09-420a-4c89-9e40-f68fbc7b8d3b-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.882732 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn"] Mar 12 14:50:01 crc kubenswrapper[4869]: E0312 14:50:01.883319 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb103b09-420a-4c89-9e40-f68fbc7b8d3b" containerName="controller-manager" Mar 12 14:50:01 crc 
kubenswrapper[4869]: I0312 14:50:01.883335 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb103b09-420a-4c89-9e40-f68fbc7b8d3b" containerName="controller-manager" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.883465 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb103b09-420a-4c89-9e40-f68fbc7b8d3b" containerName="controller-manager" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.884097 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.891690 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn"] Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.952420 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.953140 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.956229 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.956459 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.961991 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.971231 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae8e09-616d-4793-b010-9479017f9d6c-serving-cert\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.971271 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/9cae8e09-616d-4793-b010-9479017f9d6c-kube-api-access-55pm4\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.971332 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-client-ca\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.971493 
4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-config\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:01 crc kubenswrapper[4869]: I0312 14:50:01.971562 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-proxy-ca-bundles\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.072904 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-client-ca\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.072964 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.072998 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-config\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " 
pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.073022 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-proxy-ca-bundles\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.073050 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.073104 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae8e09-616d-4793-b010-9479017f9d6c-serving-cert\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.073127 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/9cae8e09-616d-4793-b010-9479017f9d6c-kube-api-access-55pm4\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.074628 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-client-ca\") 
pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.075054 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-config\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.075469 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-proxy-ca-bundles\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.099255 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae8e09-616d-4793-b010-9479017f9d6c-serving-cert\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.102981 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/9cae8e09-616d-4793-b010-9479017f9d6c-kube-api-access-55pm4\") pod \"controller-manager-6868b8f9f5-tvgwn\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.140195 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" event={"ID":"fb103b09-420a-4c89-9e40-f68fbc7b8d3b","Type":"ContainerDied","Data":"a5813c8081b8cde77e3e99e317f5ddd76df5f6ca0750d37860642af38b8a3460"} Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.140253 4869 scope.go:117] "RemoveContainer" containerID="af1af46808ddc8c90dfcf2c7d3addeeaf18a7d49b4fbfcecee71cd49d7ff25ad" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.140363 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.151510 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxjq9" event={"ID":"9b8fa922-8e49-42e1-a4a5-40069c505bf6","Type":"ContainerStarted","Data":"e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b"} Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.152257 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" podUID="0f845ac0-2137-425f-b2e7-e9326cb15eea" containerName="route-controller-manager" containerID="cri-o://968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00" gracePeriod=30 Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.175307 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.175509 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.175835 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.191788 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lxjq9" podStartSLOduration=1.907869849 podStartE2EDuration="37.191765946s" podCreationTimestamp="2026-03-12 14:49:25 +0000 UTC" firstStartedPulling="2026-03-12 14:49:26.324306275 +0000 UTC m=+118.609531553" lastFinishedPulling="2026-03-12 14:50:01.608202372 +0000 UTC m=+153.893427650" observedRunningTime="2026-03-12 14:50:02.191444875 +0000 UTC m=+154.476670153" watchObservedRunningTime="2026-03-12 14:50:02.191765946 +0000 UTC m=+154.476991224" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.218282 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.222805 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd"] Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.231395 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b759d5cf5-b9lqd"] Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.239942 4869 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.280752 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.343701 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb103b09-420a-4c89-9e40-f68fbc7b8d3b" path="/var/lib/kubelet/pods/fb103b09-420a-4c89-9e40-f68fbc7b8d3b/volumes" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.811702 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.885150 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-config\") pod \"0f845ac0-2137-425f-b2e7-e9326cb15eea\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.885231 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f845ac0-2137-425f-b2e7-e9326cb15eea-serving-cert\") pod \"0f845ac0-2137-425f-b2e7-e9326cb15eea\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.885303 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-client-ca\") pod \"0f845ac0-2137-425f-b2e7-e9326cb15eea\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.885330 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sgnm7\" (UniqueName: \"kubernetes.io/projected/0f845ac0-2137-425f-b2e7-e9326cb15eea-kube-api-access-sgnm7\") pod \"0f845ac0-2137-425f-b2e7-e9326cb15eea\" (UID: \"0f845ac0-2137-425f-b2e7-e9326cb15eea\") " Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.886089 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f845ac0-2137-425f-b2e7-e9326cb15eea" (UID: "0f845ac0-2137-425f-b2e7-e9326cb15eea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.886102 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-config" (OuterVolumeSpecName: "config") pod "0f845ac0-2137-425f-b2e7-e9326cb15eea" (UID: "0f845ac0-2137-425f-b2e7-e9326cb15eea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.886966 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.886983 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f845ac0-2137-425f-b2e7-e9326cb15eea-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.893914 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f845ac0-2137-425f-b2e7-e9326cb15eea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f845ac0-2137-425f-b2e7-e9326cb15eea" (UID: "0f845ac0-2137-425f-b2e7-e9326cb15eea"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.899807 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f845ac0-2137-425f-b2e7-e9326cb15eea-kube-api-access-sgnm7" (OuterVolumeSpecName: "kube-api-access-sgnm7") pod "0f845ac0-2137-425f-b2e7-e9326cb15eea" (UID: "0f845ac0-2137-425f-b2e7-e9326cb15eea"). InnerVolumeSpecName "kube-api-access-sgnm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.988597 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgnm7\" (UniqueName: \"kubernetes.io/projected/0f845ac0-2137-425f-b2e7-e9326cb15eea-kube-api-access-sgnm7\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:02 crc kubenswrapper[4869]: I0312 14:50:02.989022 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f845ac0-2137-425f-b2e7-e9326cb15eea-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.048111 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn"] Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.077050 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 14:50:03 crc kubenswrapper[4869]: W0312 14:50:03.077422 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod53c0810d_8571_4727_bbfe_cc5f0e9570e5.slice/crio-51e5149f63290d1cf1ef71371bd92f69065409595bb18d689e5fd938e80d36eb WatchSource:0}: Error finding container 51e5149f63290d1cf1ef71371bd92f69065409595bb18d689e5fd938e80d36eb: Status 404 returned error can't find the container with id 51e5149f63290d1cf1ef71371bd92f69065409595bb18d689e5fd938e80d36eb Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.165753 4869 
generic.go:334] "Generic (PLEG): container finished" podID="0f845ac0-2137-425f-b2e7-e9326cb15eea" containerID="968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00" exitCode=0 Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.165809 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" event={"ID":"0f845ac0-2137-425f-b2e7-e9326cb15eea","Type":"ContainerDied","Data":"968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00"} Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.165836 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" event={"ID":"0f845ac0-2137-425f-b2e7-e9326cb15eea","Type":"ContainerDied","Data":"875e8c6ccfb7b1f52a2388f1f16581f2cb10fb2c365148b5999cffc702dff677"} Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.165853 4869 scope.go:117] "RemoveContainer" containerID="968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.165930 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.173636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"53c0810d-8571-4727-bbfe-cc5f0e9570e5","Type":"ContainerStarted","Data":"51e5149f63290d1cf1ef71371bd92f69065409595bb18d689e5fd938e80d36eb"} Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.178636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" event={"ID":"9cae8e09-616d-4793-b010-9479017f9d6c","Type":"ContainerStarted","Data":"a1f3c7d1a9639bfc1206009ea7b574871045fec67809bf4ad6b1ded2f1a21a03"} Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.196481 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g"] Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.199195 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84995f8c99-znz8g"] Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.882587 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc"] Mar 12 14:50:03 crc kubenswrapper[4869]: E0312 14:50:03.883018 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f845ac0-2137-425f-b2e7-e9326cb15eea" containerName="route-controller-manager" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.883032 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f845ac0-2137-425f-b2e7-e9326cb15eea" containerName="route-controller-manager" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.883121 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f845ac0-2137-425f-b2e7-e9326cb15eea" 
containerName="route-controller-manager" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.883462 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.885732 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.885901 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.886009 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.886193 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.886283 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.886361 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 14:50:03 crc kubenswrapper[4869]: I0312 14:50:03.895096 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc"] Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.004001 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d136cf-ede3-4e08-976a-e72f20f469f7-serving-cert\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.004069 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-client-ca\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.004099 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-config\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.004120 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k95p\" (UniqueName: \"kubernetes.io/projected/00d136cf-ede3-4e08-976a-e72f20f469f7-kube-api-access-8k95p\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.081773 4869 scope.go:117] "RemoveContainer" containerID="968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00" Mar 12 14:50:04 crc kubenswrapper[4869]: E0312 14:50:04.082630 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00\": container with ID starting with 968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00 not found: ID 
does not exist" containerID="968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.082697 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00"} err="failed to get container status \"968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00\": rpc error: code = NotFound desc = could not find container \"968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00\": container with ID starting with 968f6568c8d464d6251819c72ba6e979eb93dbab5d69b37e135da870619d3f00 not found: ID does not exist" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.105342 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d136cf-ede3-4e08-976a-e72f20f469f7-serving-cert\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.105419 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-client-ca\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.105452 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-config\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc 
kubenswrapper[4869]: I0312 14:50:04.105482 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k95p\" (UniqueName: \"kubernetes.io/projected/00d136cf-ede3-4e08-976a-e72f20f469f7-kube-api-access-8k95p\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.107349 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-client-ca\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.107924 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-config\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.113345 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d136cf-ede3-4e08-976a-e72f20f469f7-serving-cert\") pod \"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.122357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k95p\" (UniqueName: \"kubernetes.io/projected/00d136cf-ede3-4e08-976a-e72f20f469f7-kube-api-access-8k95p\") pod 
\"route-controller-manager-6d77485985-fpspc\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.184595 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5nz" event={"ID":"ffa8f886-960d-4115-a0e0-3ca252d2af08","Type":"ContainerStarted","Data":"cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e"} Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.201499 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:04 crc kubenswrapper[4869]: I0312 14:50:04.342595 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f845ac0-2137-425f-b2e7-e9326cb15eea" path="/var/lib/kubelet/pods/0f845ac0-2137-425f-b2e7-e9326cb15eea/volumes" Mar 12 14:50:05 crc kubenswrapper[4869]: I0312 14:50:05.028026 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc"] Mar 12 14:50:05 crc kubenswrapper[4869]: W0312 14:50:05.038977 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d136cf_ede3_4e08_976a_e72f20f469f7.slice/crio-14ca06fbbf063354188164e3824404dc79c3258523cee555f5472add5666c890 WatchSource:0}: Error finding container 14ca06fbbf063354188164e3824404dc79c3258523cee555f5472add5666c890: Status 404 returned error can't find the container with id 14ca06fbbf063354188164e3824404dc79c3258523cee555f5472add5666c890 Mar 12 14:50:05 crc kubenswrapper[4869]: I0312 14:50:05.192054 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" 
event={"ID":"00d136cf-ede3-4e08-976a-e72f20f469f7","Type":"ContainerStarted","Data":"14ca06fbbf063354188164e3824404dc79c3258523cee555f5472add5666c890"} Mar 12 14:50:05 crc kubenswrapper[4869]: I0312 14:50:05.209031 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zs5nz" podStartSLOduration=5.054384757 podStartE2EDuration="37.209013858s" podCreationTimestamp="2026-03-12 14:49:28 +0000 UTC" firstStartedPulling="2026-03-12 14:49:30.703925004 +0000 UTC m=+122.989150282" lastFinishedPulling="2026-03-12 14:50:02.858554105 +0000 UTC m=+155.143779383" observedRunningTime="2026-03-12 14:50:05.207996904 +0000 UTC m=+157.493222192" watchObservedRunningTime="2026-03-12 14:50:05.209013858 +0000 UTC m=+157.494239136" Mar 12 14:50:05 crc kubenswrapper[4869]: I0312 14:50:05.431313 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:50:05 crc kubenswrapper[4869]: I0312 14:50:05.431403 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.117499 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.217922 4869 generic.go:334] "Generic (PLEG): container finished" podID="53c0810d-8571-4727-bbfe-cc5f0e9570e5" containerID="fef00170e6679c00b1251ce1587b9abddcd7c3274d44fbd36ef5f4fb22c86b95" exitCode=0 Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.218161 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"53c0810d-8571-4727-bbfe-cc5f0e9570e5","Type":"ContainerDied","Data":"fef00170e6679c00b1251ce1587b9abddcd7c3274d44fbd36ef5f4fb22c86b95"} Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.220946 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" event={"ID":"9cae8e09-616d-4793-b010-9479017f9d6c","Type":"ContainerStarted","Data":"a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c"} Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.221746 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.226751 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.227865 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sc6nv" event={"ID":"078c5068-e532-4d7f-badc-c7f0c23a3191","Type":"ContainerStarted","Data":"03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb"} Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.237231 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" event={"ID":"00d136cf-ede3-4e08-976a-e72f20f469f7","Type":"ContainerStarted","Data":"9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc"} Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.237473 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.257946 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sc6nv" podStartSLOduration=3.168554944 podStartE2EDuration="39.257892051s" podCreationTimestamp="2026-03-12 14:49:27 +0000 UTC" firstStartedPulling="2026-03-12 14:49:28.675741385 +0000 UTC m=+120.960966663" lastFinishedPulling="2026-03-12 
14:50:04.765078492 +0000 UTC m=+157.050303770" observedRunningTime="2026-03-12 14:50:06.252725639 +0000 UTC m=+158.537950917" watchObservedRunningTime="2026-03-12 14:50:06.257892051 +0000 UTC m=+158.543117329" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.274312 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" podStartSLOduration=6.274295448 podStartE2EDuration="6.274295448s" podCreationTimestamp="2026-03-12 14:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:06.271236526 +0000 UTC m=+158.556461794" watchObservedRunningTime="2026-03-12 14:50:06.274295448 +0000 UTC m=+158.559520716" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.297744 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" podStartSLOduration=6.297721779 podStartE2EDuration="6.297721779s" podCreationTimestamp="2026-03-12 14:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:06.294946637 +0000 UTC m=+158.580171915" watchObservedRunningTime="2026-03-12 14:50:06.297721779 +0000 UTC m=+158.582947057" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.306801 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:50:06 crc kubenswrapper[4869]: I0312 14:50:06.629894 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.206296 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-jvphv" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.206359 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jvphv" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.246300 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngq4q" event={"ID":"8bf6f93d-a6f8-494b-abfa-47f8a164b667","Type":"ContainerStarted","Data":"47f226d09b13229e429c8112fdcda62314f1524f8689557e2310c93169c1764e"} Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.250247 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfj7c" event={"ID":"74481165-f298-4cdc-9af7-feef043fa182","Type":"ContainerStarted","Data":"abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c"} Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.254259 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66jjg" event={"ID":"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2","Type":"ContainerStarted","Data":"d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687"} Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.270779 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jvphv" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.288784 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngq4q" podStartSLOduration=3.721217638 podStartE2EDuration="43.288769064s" podCreationTimestamp="2026-03-12 14:49:24 +0000 UTC" firstStartedPulling="2026-03-12 14:49:26.380308765 +0000 UTC m=+118.665534043" lastFinishedPulling="2026-03-12 14:50:05.947860191 +0000 UTC m=+158.233085469" observedRunningTime="2026-03-12 14:50:07.275048156 +0000 UTC m=+159.560273434" watchObservedRunningTime="2026-03-12 
14:50:07.288769064 +0000 UTC m=+159.573994332" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.344154 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-66jjg" podStartSLOduration=2.592016029 podStartE2EDuration="42.34413164s" podCreationTimestamp="2026-03-12 14:49:25 +0000 UTC" firstStartedPulling="2026-03-12 14:49:26.391827929 +0000 UTC m=+118.677053207" lastFinishedPulling="2026-03-12 14:50:06.14394354 +0000 UTC m=+158.429168818" observedRunningTime="2026-03-12 14:50:07.342355641 +0000 UTC m=+159.627580929" watchObservedRunningTime="2026-03-12 14:50:07.34413164 +0000 UTC m=+159.629356918" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.364332 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cfj7c" podStartSLOduration=3.884721687 podStartE2EDuration="39.364313923s" podCreationTimestamp="2026-03-12 14:49:28 +0000 UTC" firstStartedPulling="2026-03-12 14:49:30.720786307 +0000 UTC m=+123.006011585" lastFinishedPulling="2026-03-12 14:50:06.200378543 +0000 UTC m=+158.485603821" observedRunningTime="2026-03-12 14:50:07.362819553 +0000 UTC m=+159.648044831" watchObservedRunningTime="2026-03-12 14:50:07.364313923 +0000 UTC m=+159.649539201" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.380832 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jvphv" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.617955 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.618179 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.647529 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.755858 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kubelet-dir\") pod \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.755908 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kube-api-access\") pod \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\" (UID: \"53c0810d-8571-4727-bbfe-cc5f0e9570e5\") " Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.756232 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "53c0810d-8571-4727-bbfe-cc5f0e9570e5" (UID: "53c0810d-8571-4727-bbfe-cc5f0e9570e5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.762852 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "53c0810d-8571-4727-bbfe-cc5f0e9570e5" (UID: "53c0810d-8571-4727-bbfe-cc5f0e9570e5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.857697 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:07 crc kubenswrapper[4869]: I0312 14:50:07.857932 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53c0810d-8571-4727-bbfe-cc5f0e9570e5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.263743 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"53c0810d-8571-4727-bbfe-cc5f0e9570e5","Type":"ContainerDied","Data":"51e5149f63290d1cf1ef71371bd92f69065409595bb18d689e5fd938e80d36eb"} Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.263792 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51e5149f63290d1cf1ef71371bd92f69065409595bb18d689e5fd938e80d36eb" Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.263855 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.426992 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.427072 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.661289 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-sc6nv" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="registry-server" probeResult="failure" output=< Mar 12 14:50:08 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 14:50:08 crc kubenswrapper[4869]: > Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.838079 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:50:08 crc kubenswrapper[4869]: I0312 14:50:08.838266 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:50:09 crc kubenswrapper[4869]: I0312 14:50:09.477108 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zs5nz" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="registry-server" probeResult="failure" output=< Mar 12 14:50:09 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 14:50:09 crc kubenswrapper[4869]: > Mar 12 14:50:09 crc kubenswrapper[4869]: I0312 14:50:09.900865 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cfj7c" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="registry-server" probeResult="failure" output=< Mar 12 14:50:09 crc kubenswrapper[4869]: timeout: failed to 
connect service ":50051" within 1s Mar 12 14:50:09 crc kubenswrapper[4869]: > Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.157385 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 14:50:12 crc kubenswrapper[4869]: E0312 14:50:12.157659 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c0810d-8571-4727-bbfe-cc5f0e9570e5" containerName="pruner" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.157671 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c0810d-8571-4727-bbfe-cc5f0e9570e5" containerName="pruner" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.157776 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c0810d-8571-4727-bbfe-cc5f0e9570e5" containerName="pruner" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.158127 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.159452 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.168478 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.173824 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.215658 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.215763 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kube-api-access\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.215788 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-var-lock\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.316586 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kube-api-access\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.316636 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-var-lock\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.316662 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.316737 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.316759 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-var-lock\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.378298 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kube-api-access\") pod \"installer-9-crc\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:12 crc kubenswrapper[4869]: I0312 14:50:12.477038 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:15 crc kubenswrapper[4869]: I0312 14:50:15.241139 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:50:15 crc kubenswrapper[4869]: I0312 14:50:15.241192 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:50:15 crc kubenswrapper[4869]: I0312 14:50:15.286296 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:50:15 crc kubenswrapper[4869]: I0312 14:50:15.353093 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:50:15 crc kubenswrapper[4869]: I0312 14:50:15.609249 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-66jjg" Mar 12 14:50:15 crc kubenswrapper[4869]: I0312 14:50:15.609302 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-66jjg" Mar 12 14:50:15 crc kubenswrapper[4869]: I0312 14:50:15.646041 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-66jjg" Mar 12 14:50:16 crc kubenswrapper[4869]: I0312 14:50:16.377357 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-66jjg" Mar 12 14:50:16 crc kubenswrapper[4869]: I0312 14:50:16.807844 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 14:50:16 crc kubenswrapper[4869]: W0312 14:50:16.850663 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-podfafe913d_7e55_4f19_8ebd_7ef3d6fdd4a2.slice/crio-f27749ee1196b7f5a52c3909687a81dda32434db6622b688576121ca82014312 WatchSource:0}: Error finding container f27749ee1196b7f5a52c3909687a81dda32434db6622b688576121ca82014312: Status 404 returned error can't find the container with id f27749ee1196b7f5a52c3909687a81dda32434db6622b688576121ca82014312 Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.077694 4869 csr.go:261] certificate signing request csr-4p5cj is approved, waiting to be issued Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.083831 4869 csr.go:257] certificate signing request csr-4p5cj is issued Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.325799 4869 generic.go:334] "Generic (PLEG): container finished" podID="c6a83972-34aa-4505-8b6f-b1345b7981cd" containerID="27d456d07c8dc744d30a6ee25564a39108e1680d5cba5b71d11be77d04345460" exitCode=0 Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.326199 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555450-qsz9r" event={"ID":"c6a83972-34aa-4505-8b6f-b1345b7981cd","Type":"ContainerDied","Data":"27d456d07c8dc744d30a6ee25564a39108e1680d5cba5b71d11be77d04345460"} Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.328150 4869 generic.go:334] "Generic (PLEG): container finished" podID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerID="aab48e4b0b825c0047d0bf8b3aac6f3b96207b21f69f699241195797185f37a1" exitCode=0 Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.328234 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x5kk" event={"ID":"82120625-a0ba-4136-b3d5-23b2b78f72cb","Type":"ContainerDied","Data":"aab48e4b0b825c0047d0bf8b3aac6f3b96207b21f69f699241195797185f37a1"} Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.333638 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2","Type":"ContainerStarted","Data":"ffe530aa4230f2adc429700dea2edfed048b5e43bc5e2268dee47c2440632fc7"} Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.333697 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2","Type":"ContainerStarted","Data":"f27749ee1196b7f5a52c3909687a81dda32434db6622b688576121ca82014312"} Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.363516 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.363494581 podStartE2EDuration="5.363494581s" podCreationTimestamp="2026-03-12 14:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:17.357637445 +0000 UTC m=+169.642862763" watchObservedRunningTime="2026-03-12 14:50:17.363494581 +0000 UTC m=+169.648719899" Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.667139 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.731246 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:50:17 crc kubenswrapper[4869]: I0312 14:50:17.915263 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66jjg"] Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.085951 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-11 09:29:16.286765293 +0000 UTC Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.086008 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6570h38m58.200761014s for 
next certificate rotation Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.340228 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-66jjg" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="registry-server" containerID="cri-o://d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687" gracePeriod=2 Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.487096 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.540562 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.687742 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-qsz9r" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.796893 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8frrv\" (UniqueName: \"kubernetes.io/projected/c6a83972-34aa-4505-8b6f-b1345b7981cd-kube-api-access-8frrv\") pod \"c6a83972-34aa-4505-8b6f-b1345b7981cd\" (UID: \"c6a83972-34aa-4505-8b6f-b1345b7981cd\") " Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.801844 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a83972-34aa-4505-8b6f-b1345b7981cd-kube-api-access-8frrv" (OuterVolumeSpecName: "kube-api-access-8frrv") pod "c6a83972-34aa-4505-8b6f-b1345b7981cd" (UID: "c6a83972-34aa-4505-8b6f-b1345b7981cd"). InnerVolumeSpecName "kube-api-access-8frrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.813740 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66jjg" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.876829 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.899405 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-catalog-content\") pod \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.899468 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prjbj\" (UniqueName: \"kubernetes.io/projected/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-kube-api-access-prjbj\") pod \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.899503 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-utilities\") pod \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\" (UID: \"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2\") " Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.899890 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8frrv\" (UniqueName: \"kubernetes.io/projected/c6a83972-34aa-4505-8b6f-b1345b7981cd-kube-api-access-8frrv\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.900350 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-utilities" (OuterVolumeSpecName: "utilities") pod "1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" (UID: "1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.907776 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-kube-api-access-prjbj" (OuterVolumeSpecName: "kube-api-access-prjbj") pod "1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" (UID: "1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2"). InnerVolumeSpecName "kube-api-access-prjbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.931868 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:50:18 crc kubenswrapper[4869]: I0312 14:50:18.956531 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" (UID: "1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.001619 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prjbj\" (UniqueName: \"kubernetes.io/projected/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-kube-api-access-prjbj\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.001656 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.001669 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.087043 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 10:15:03.010194593 +0000 UTC Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.087101 4869 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6139h24m43.923098646s for next certificate rotation Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.352252 4869 generic.go:334] "Generic (PLEG): container finished" podID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerID="d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687" exitCode=0 Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.352337 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66jjg" event={"ID":"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2","Type":"ContainerDied","Data":"d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687"} Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.352341 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-66jjg" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.352378 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-66jjg" event={"ID":"1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2","Type":"ContainerDied","Data":"deac537a60ba7a54097b7177f8f9c17d750cc2efa13edb1c4ccc6f16acc7a29c"} Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.352399 4869 scope.go:117] "RemoveContainer" containerID="d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.355353 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-qsz9r" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.355404 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555450-qsz9r" event={"ID":"c6a83972-34aa-4505-8b6f-b1345b7981cd","Type":"ContainerDied","Data":"3475af765ae81cda0a49bac97f192cd9c18d7fb2ceee2066bbc34a99ac85bc8a"} Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.355455 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3475af765ae81cda0a49bac97f192cd9c18d7fb2ceee2066bbc34a99ac85bc8a" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.359503 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x5kk" event={"ID":"82120625-a0ba-4136-b3d5-23b2b78f72cb","Type":"ContainerStarted","Data":"c25c773e978f8ff56001b54b577884aaedf0678d86137cc5c068fc32de36da63"} Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.374797 4869 scope.go:117] "RemoveContainer" containerID="ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.399373 4869 scope.go:117] "RemoveContainer" 
containerID="7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.401565 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6x5kk" podStartSLOduration=3.084924484 podStartE2EDuration="54.401524044s" podCreationTimestamp="2026-03-12 14:49:25 +0000 UTC" firstStartedPulling="2026-03-12 14:49:27.421910751 +0000 UTC m=+119.707136029" lastFinishedPulling="2026-03-12 14:50:18.738510321 +0000 UTC m=+171.023735589" observedRunningTime="2026-03-12 14:50:19.398827814 +0000 UTC m=+171.684053092" watchObservedRunningTime="2026-03-12 14:50:19.401524044 +0000 UTC m=+171.686749342" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.425500 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-66jjg"] Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.429653 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-66jjg"] Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.430182 4869 scope.go:117] "RemoveContainer" containerID="d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687" Mar 12 14:50:19 crc kubenswrapper[4869]: E0312 14:50:19.430703 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687\": container with ID starting with d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687 not found: ID does not exist" containerID="d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.430738 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687"} err="failed to get container status 
\"d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687\": rpc error: code = NotFound desc = could not find container \"d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687\": container with ID starting with d8b0b007acc0e6c6b1f1ed4aaa7abca1d21eb859ca7baf3d20b899986b2b5687 not found: ID does not exist" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.430758 4869 scope.go:117] "RemoveContainer" containerID="ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927" Mar 12 14:50:19 crc kubenswrapper[4869]: E0312 14:50:19.433345 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927\": container with ID starting with ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927 not found: ID does not exist" containerID="ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.433391 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927"} err="failed to get container status \"ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927\": rpc error: code = NotFound desc = could not find container \"ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927\": container with ID starting with ee7998dbe3181a67f3f31d6d2c2f8cb53fec28f68f15bdab5d0cd0f5f74a0927 not found: ID does not exist" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.433406 4869 scope.go:117] "RemoveContainer" containerID="7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44" Mar 12 14:50:19 crc kubenswrapper[4869]: E0312 14:50:19.433794 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44\": container with ID starting with 7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44 not found: ID does not exist" containerID="7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44" Mar 12 14:50:19 crc kubenswrapper[4869]: I0312 14:50:19.433824 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44"} err="failed to get container status \"7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44\": rpc error: code = NotFound desc = could not find container \"7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44\": container with ID starting with 7947438a7cd6cf4bec5db1b5877b2bf217cc0695f8fc6ad4d40950e18539eb44 not found: ID does not exist" Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.316482 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sc6nv"] Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.317000 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sc6nv" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="registry-server" containerID="cri-o://03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb" gracePeriod=2 Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.344909 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" path="/var/lib/kubelet/pods/1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2/volumes" Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.462154 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn"] Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.462414 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" podUID="9cae8e09-616d-4793-b010-9479017f9d6c" containerName="controller-manager" containerID="cri-o://a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c" gracePeriod=30 Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.480272 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc"] Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.481496 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" podUID="00d136cf-ede3-4e08-976a-e72f20f469f7" containerName="route-controller-manager" containerID="cri-o://9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc" gracePeriod=30 Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.895571 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.927016 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-utilities\") pod \"078c5068-e532-4d7f-badc-c7f0c23a3191\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.927109 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-catalog-content\") pod \"078c5068-e532-4d7f-badc-c7f0c23a3191\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.927148 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsc89\" (UniqueName: 
\"kubernetes.io/projected/078c5068-e532-4d7f-badc-c7f0c23a3191-kube-api-access-hsc89\") pod \"078c5068-e532-4d7f-badc-c7f0c23a3191\" (UID: \"078c5068-e532-4d7f-badc-c7f0c23a3191\") " Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.932608 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/078c5068-e532-4d7f-badc-c7f0c23a3191-kube-api-access-hsc89" (OuterVolumeSpecName: "kube-api-access-hsc89") pod "078c5068-e532-4d7f-badc-c7f0c23a3191" (UID: "078c5068-e532-4d7f-badc-c7f0c23a3191"). InnerVolumeSpecName "kube-api-access-hsc89". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.954573 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-utilities" (OuterVolumeSpecName: "utilities") pod "078c5068-e532-4d7f-badc-c7f0c23a3191" (UID: "078c5068-e532-4d7f-badc-c7f0c23a3191"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:20 crc kubenswrapper[4869]: I0312 14:50:20.966018 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "078c5068-e532-4d7f-badc-c7f0c23a3191" (UID: "078c5068-e532-4d7f-badc-c7f0c23a3191"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.021907 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.028443 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.028480 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/078c5068-e532-4d7f-badc-c7f0c23a3191-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.028495 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsc89\" (UniqueName: \"kubernetes.io/projected/078c5068-e532-4d7f-badc-c7f0c23a3191-kube-api-access-hsc89\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.123454 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.129150 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-config\") pod \"00d136cf-ede3-4e08-976a-e72f20f469f7\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.129202 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-client-ca\") pod \"00d136cf-ede3-4e08-976a-e72f20f469f7\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.129260 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d136cf-ede3-4e08-976a-e72f20f469f7-serving-cert\") pod \"00d136cf-ede3-4e08-976a-e72f20f469f7\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.129296 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k95p\" (UniqueName: \"kubernetes.io/projected/00d136cf-ede3-4e08-976a-e72f20f469f7-kube-api-access-8k95p\") pod \"00d136cf-ede3-4e08-976a-e72f20f469f7\" (UID: \"00d136cf-ede3-4e08-976a-e72f20f469f7\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.129962 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "00d136cf-ede3-4e08-976a-e72f20f469f7" (UID: "00d136cf-ede3-4e08-976a-e72f20f469f7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.129993 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-config" (OuterVolumeSpecName: "config") pod "00d136cf-ede3-4e08-976a-e72f20f469f7" (UID: "00d136cf-ede3-4e08-976a-e72f20f469f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.132137 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d136cf-ede3-4e08-976a-e72f20f469f7-kube-api-access-8k95p" (OuterVolumeSpecName: "kube-api-access-8k95p") pod "00d136cf-ede3-4e08-976a-e72f20f469f7" (UID: "00d136cf-ede3-4e08-976a-e72f20f469f7"). InnerVolumeSpecName "kube-api-access-8k95p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.132828 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d136cf-ede3-4e08-976a-e72f20f469f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00d136cf-ede3-4e08-976a-e72f20f469f7" (UID: "00d136cf-ede3-4e08-976a-e72f20f469f7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230307 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/9cae8e09-616d-4793-b010-9479017f9d6c-kube-api-access-55pm4\") pod \"9cae8e09-616d-4793-b010-9479017f9d6c\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230386 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae8e09-616d-4793-b010-9479017f9d6c-serving-cert\") pod \"9cae8e09-616d-4793-b010-9479017f9d6c\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230427 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-proxy-ca-bundles\") pod \"9cae8e09-616d-4793-b010-9479017f9d6c\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230473 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-config\") pod \"9cae8e09-616d-4793-b010-9479017f9d6c\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230519 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-client-ca\") pod \"9cae8e09-616d-4793-b010-9479017f9d6c\" (UID: \"9cae8e09-616d-4793-b010-9479017f9d6c\") " Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230743 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/00d136cf-ede3-4e08-976a-e72f20f469f7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230755 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k95p\" (UniqueName: \"kubernetes.io/projected/00d136cf-ede3-4e08-976a-e72f20f469f7-kube-api-access-8k95p\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230766 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.230774 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d136cf-ede3-4e08-976a-e72f20f469f7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.231269 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9cae8e09-616d-4793-b010-9479017f9d6c" (UID: "9cae8e09-616d-4793-b010-9479017f9d6c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.231292 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-client-ca" (OuterVolumeSpecName: "client-ca") pod "9cae8e09-616d-4793-b010-9479017f9d6c" (UID: "9cae8e09-616d-4793-b010-9479017f9d6c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.231420 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-config" (OuterVolumeSpecName: "config") pod "9cae8e09-616d-4793-b010-9479017f9d6c" (UID: "9cae8e09-616d-4793-b010-9479017f9d6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.233520 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cae8e09-616d-4793-b010-9479017f9d6c-kube-api-access-55pm4" (OuterVolumeSpecName: "kube-api-access-55pm4") pod "9cae8e09-616d-4793-b010-9479017f9d6c" (UID: "9cae8e09-616d-4793-b010-9479017f9d6c"). InnerVolumeSpecName "kube-api-access-55pm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.233670 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cae8e09-616d-4793-b010-9479017f9d6c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9cae8e09-616d-4793-b010-9479017f9d6c" (UID: "9cae8e09-616d-4793-b010-9479017f9d6c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.331921 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.331962 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.331980 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cae8e09-616d-4793-b010-9479017f9d6c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.331993 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55pm4\" (UniqueName: \"kubernetes.io/projected/9cae8e09-616d-4793-b010-9479017f9d6c-kube-api-access-55pm4\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.332006 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cae8e09-616d-4793-b010-9479017f9d6c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.372120 4869 generic.go:334] "Generic (PLEG): container finished" podID="00d136cf-ede3-4e08-976a-e72f20f469f7" containerID="9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc" exitCode=0 Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.372183 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.372203 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" event={"ID":"00d136cf-ede3-4e08-976a-e72f20f469f7","Type":"ContainerDied","Data":"9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc"} Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.372434 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc" event={"ID":"00d136cf-ede3-4e08-976a-e72f20f469f7","Type":"ContainerDied","Data":"14ca06fbbf063354188164e3824404dc79c3258523cee555f5472add5666c890"} Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.372482 4869 scope.go:117] "RemoveContainer" containerID="9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.374632 4869 generic.go:334] "Generic (PLEG): container finished" podID="9cae8e09-616d-4793-b010-9479017f9d6c" containerID="a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c" exitCode=0 Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.374711 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" event={"ID":"9cae8e09-616d-4793-b010-9479017f9d6c","Type":"ContainerDied","Data":"a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c"} Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.374751 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" event={"ID":"9cae8e09-616d-4793-b010-9479017f9d6c","Type":"ContainerDied","Data":"a1f3c7d1a9639bfc1206009ea7b574871045fec67809bf4ad6b1ded2f1a21a03"} Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.374825 4869 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.385354 4869 generic.go:334] "Generic (PLEG): container finished" podID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerID="03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb" exitCode=0 Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.385401 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sc6nv" event={"ID":"078c5068-e532-4d7f-badc-c7f0c23a3191","Type":"ContainerDied","Data":"03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb"} Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.385431 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sc6nv" event={"ID":"078c5068-e532-4d7f-badc-c7f0c23a3191","Type":"ContainerDied","Data":"c4893c76e55ac2d47bd46e145d8c2c351efa6c81b90001de554a8f80fd8a051d"} Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.385496 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sc6nv" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.390461 4869 scope.go:117] "RemoveContainer" containerID="9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.397634 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc\": container with ID starting with 9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc not found: ID does not exist" containerID="9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.397686 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc"} err="failed to get container status \"9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc\": rpc error: code = NotFound desc = could not find container \"9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc\": container with ID starting with 9e451664f3c6c584face0b96e136b3ef2cdc8425f4ce8aaf4da939d090d474fc not found: ID does not exist" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.397725 4869 scope.go:117] "RemoveContainer" containerID="a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.416669 4869 scope.go:117] "RemoveContainer" containerID="a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.417488 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc"] Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.419949 4869 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c\": container with ID starting with a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c not found: ID does not exist" containerID="a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.419985 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c"} err="failed to get container status \"a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c\": rpc error: code = NotFound desc = could not find container \"a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c\": container with ID starting with a4c0fea1b26394323b64abe2a14207ac8e7c17f4726dbe29f7a0cea29155238c not found: ID does not exist" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.420009 4869 scope.go:117] "RemoveContainer" containerID="03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.421140 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d77485985-fpspc"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.432398 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.439785 4869 scope.go:117] "RemoveContainer" containerID="a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.442279 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6868b8f9f5-tvgwn"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.446800 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-sc6nv"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.449459 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sc6nv"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.457352 4869 scope.go:117] "RemoveContainer" containerID="0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.474277 4869 scope.go:117] "RemoveContainer" containerID="03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.474804 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb\": container with ID starting with 03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb not found: ID does not exist" containerID="03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.474835 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb"} err="failed to get container status \"03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb\": rpc error: code = NotFound desc = could not find container \"03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb\": container with ID starting with 03cc0faa6d01c2e9a7d025393b5ae5388cd6bd992de8ad659d4895dc117947bb not found: ID does not exist" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.474859 4869 scope.go:117] "RemoveContainer" containerID="a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.475196 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83\": container with ID starting with a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83 not found: ID does not exist" containerID="a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.475220 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83"} err="failed to get container status \"a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83\": rpc error: code = NotFound desc = could not find container \"a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83\": container with ID starting with a54059a61ff0154f6277e5821c5e38ba15e6957de731c93b3025e09b06c7da83 not found: ID does not exist" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.475234 4869 scope.go:117] "RemoveContainer" containerID="0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.475533 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f\": container with ID starting with 0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f not found: ID does not exist" containerID="0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.475593 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f"} err="failed to get container status \"0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f\": rpc error: code = NotFound desc = could not find container \"0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f\": 
container with ID starting with 0d79d2423fd54af475a92b1b75c8b60577a3d77277021524f44e05496fc8f29f not found: ID does not exist" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.895940 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7"] Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896253 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="extract-content" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896278 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="extract-content" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896293 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="registry-server" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896302 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="registry-server" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896319 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="extract-content" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896330 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="extract-content" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896348 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="registry-server" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896356 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="registry-server" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896366 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a83972-34aa-4505-8b6f-b1345b7981cd" containerName="oc" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896374 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a83972-34aa-4505-8b6f-b1345b7981cd" containerName="oc" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896392 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cae8e09-616d-4793-b010-9479017f9d6c" containerName="controller-manager" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896400 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cae8e09-616d-4793-b010-9479017f9d6c" containerName="controller-manager" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896413 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d136cf-ede3-4e08-976a-e72f20f469f7" containerName="route-controller-manager" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896421 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d136cf-ede3-4e08-976a-e72f20f469f7" containerName="route-controller-manager" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896441 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="extract-utilities" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896449 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="extract-utilities" Mar 12 14:50:21 crc kubenswrapper[4869]: E0312 14:50:21.896462 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="extract-utilities" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896470 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="extract-utilities" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896616 4869 
memory_manager.go:354] "RemoveStaleState removing state" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" containerName="registry-server" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896634 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cae8e09-616d-4793-b010-9479017f9d6c" containerName="controller-manager" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896646 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a83972-34aa-4505-8b6f-b1345b7981cd" containerName="oc" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896658 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d136cf-ede3-4e08-976a-e72f20f469f7" containerName="route-controller-manager" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.896669 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fbbcb7b-e57c-421e-84b8-24b5b27fd6a2" containerName="registry-server" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.897162 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.898666 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.899100 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.899474 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.899625 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.900978 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.901786 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.902957 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-665bf8c958-gqpvl"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.903786 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.906112 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.906317 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.906419 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.907640 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.907739 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.907827 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.908630 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.912361 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936127 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-665bf8c958-gqpvl"] Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936674 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-client-ca\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936724 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-client-ca\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936771 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-proxy-ca-bundles\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936841 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-config\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936870 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f711c19-4a9c-4fc6-b728-b0aa0749597b-serving-cert\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 
14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936894 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b2ae3b-cb2f-4625-b30d-c92c31084966-serving-cert\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936910 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6f4g\" (UniqueName: \"kubernetes.io/projected/d1b2ae3b-cb2f-4625-b30d-c92c31084966-kube-api-access-d6f4g\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936930 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-config\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:21 crc kubenswrapper[4869]: I0312 14:50:21.936947 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkjhf\" (UniqueName: \"kubernetes.io/projected/1f711c19-4a9c-4fc6-b728-b0aa0749597b-kube-api-access-kkjhf\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.037774 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-proxy-ca-bundles\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.037856 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-config\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.037882 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f711c19-4a9c-4fc6-b728-b0aa0749597b-serving-cert\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.037916 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b2ae3b-cb2f-4625-b30d-c92c31084966-serving-cert\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.037940 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6f4g\" (UniqueName: \"kubernetes.io/projected/d1b2ae3b-cb2f-4625-b30d-c92c31084966-kube-api-access-d6f4g\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 
14:50:22.037968 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-config\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.037991 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkjhf\" (UniqueName: \"kubernetes.io/projected/1f711c19-4a9c-4fc6-b728-b0aa0749597b-kube-api-access-kkjhf\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.038021 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-client-ca\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.038075 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-client-ca\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.039055 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-client-ca\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " 
pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.039071 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-proxy-ca-bundles\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.039430 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-config\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.039719 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-config\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.039929 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-client-ca\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.056499 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f711c19-4a9c-4fc6-b728-b0aa0749597b-serving-cert\") pod 
\"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.057070 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkjhf\" (UniqueName: \"kubernetes.io/projected/1f711c19-4a9c-4fc6-b728-b0aa0749597b-kube-api-access-kkjhf\") pod \"controller-manager-665bf8c958-gqpvl\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.057586 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b2ae3b-cb2f-4625-b30d-c92c31084966-serving-cert\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.058434 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6f4g\" (UniqueName: \"kubernetes.io/projected/d1b2ae3b-cb2f-4625-b30d-c92c31084966-kube-api-access-d6f4g\") pod \"route-controller-manager-dbd769694-l5ft7\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.221489 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.236282 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.346377 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d136cf-ede3-4e08-976a-e72f20f469f7" path="/var/lib/kubelet/pods/00d136cf-ede3-4e08-976a-e72f20f469f7/volumes" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.347365 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="078c5068-e532-4d7f-badc-c7f0c23a3191" path="/var/lib/kubelet/pods/078c5068-e532-4d7f-badc-c7f0c23a3191/volumes" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.348132 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cae8e09-616d-4793-b010-9479017f9d6c" path="/var/lib/kubelet/pods/9cae8e09-616d-4793-b010-9479017f9d6c/volumes" Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.403369 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7"] Mar 12 14:50:22 crc kubenswrapper[4869]: W0312 14:50:22.411528 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b2ae3b_cb2f_4625_b30d_c92c31084966.slice/crio-d9732cbe6bc657d3b9bb6004b754988588e2d50d55714770907349b4e7b42307 WatchSource:0}: Error finding container d9732cbe6bc657d3b9bb6004b754988588e2d50d55714770907349b4e7b42307: Status 404 returned error can't find the container with id d9732cbe6bc657d3b9bb6004b754988588e2d50d55714770907349b4e7b42307 Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.464235 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-665bf8c958-gqpvl"] Mar 12 14:50:22 crc kubenswrapper[4869]: W0312 14:50:22.479635 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f711c19_4a9c_4fc6_b728_b0aa0749597b.slice/crio-768a872213ef60ef8df10461d0a83ec9a99a7e46f0b97b17f50cbc23a07bd00e WatchSource:0}: Error finding container 768a872213ef60ef8df10461d0a83ec9a99a7e46f0b97b17f50cbc23a07bd00e: Status 404 returned error can't find the container with id 768a872213ef60ef8df10461d0a83ec9a99a7e46f0b97b17f50cbc23a07bd00e Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.712448 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfj7c"] Mar 12 14:50:22 crc kubenswrapper[4869]: I0312 14:50:22.712887 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cfj7c" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="registry-server" containerID="cri-o://abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c" gracePeriod=2 Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.305428 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.381999 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-catalog-content\") pod \"74481165-f298-4cdc-9af7-feef043fa182\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.382059 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nw6v\" (UniqueName: \"kubernetes.io/projected/74481165-f298-4cdc-9af7-feef043fa182-kube-api-access-4nw6v\") pod \"74481165-f298-4cdc-9af7-feef043fa182\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.382115 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-utilities\") pod \"74481165-f298-4cdc-9af7-feef043fa182\" (UID: \"74481165-f298-4cdc-9af7-feef043fa182\") " Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.391476 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-utilities" (OuterVolumeSpecName: "utilities") pod "74481165-f298-4cdc-9af7-feef043fa182" (UID: "74481165-f298-4cdc-9af7-feef043fa182"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.395242 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74481165-f298-4cdc-9af7-feef043fa182-kube-api-access-4nw6v" (OuterVolumeSpecName: "kube-api-access-4nw6v") pod "74481165-f298-4cdc-9af7-feef043fa182" (UID: "74481165-f298-4cdc-9af7-feef043fa182"). InnerVolumeSpecName "kube-api-access-4nw6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.414367 4869 generic.go:334] "Generic (PLEG): container finished" podID="74481165-f298-4cdc-9af7-feef043fa182" containerID="abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c" exitCode=0 Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.414433 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfj7c" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.414468 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfj7c" event={"ID":"74481165-f298-4cdc-9af7-feef043fa182","Type":"ContainerDied","Data":"abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c"} Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.414506 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfj7c" event={"ID":"74481165-f298-4cdc-9af7-feef043fa182","Type":"ContainerDied","Data":"0d5ecf96236ca596f2256b0b02e89b1dab6a0f4cd5450bf5d232fa2604763606"} Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.414531 4869 scope.go:117] "RemoveContainer" containerID="abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.416492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" event={"ID":"d1b2ae3b-cb2f-4625-b30d-c92c31084966","Type":"ContainerStarted","Data":"f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69"} Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.416568 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.416584 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" event={"ID":"d1b2ae3b-cb2f-4625-b30d-c92c31084966","Type":"ContainerStarted","Data":"d9732cbe6bc657d3b9bb6004b754988588e2d50d55714770907349b4e7b42307"} Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.417381 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" event={"ID":"1f711c19-4a9c-4fc6-b728-b0aa0749597b","Type":"ContainerStarted","Data":"f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448"} Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.417418 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" event={"ID":"1f711c19-4a9c-4fc6-b728-b0aa0749597b","Type":"ContainerStarted","Data":"768a872213ef60ef8df10461d0a83ec9a99a7e46f0b97b17f50cbc23a07bd00e"} Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.417700 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.421308 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.422420 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.432246 4869 scope.go:117] "RemoveContainer" containerID="3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.444871 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" podStartSLOduration=3.444848689 
podStartE2EDuration="3.444848689s" podCreationTimestamp="2026-03-12 14:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:23.438959282 +0000 UTC m=+175.724184560" watchObservedRunningTime="2026-03-12 14:50:23.444848689 +0000 UTC m=+175.730073977" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.458422 4869 scope.go:117] "RemoveContainer" containerID="cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.468095 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" podStartSLOduration=3.468068963 podStartE2EDuration="3.468068963s" podCreationTimestamp="2026-03-12 14:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:23.460837122 +0000 UTC m=+175.746062400" watchObservedRunningTime="2026-03-12 14:50:23.468068963 +0000 UTC m=+175.753294241" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.487597 4869 scope.go:117] "RemoveContainer" containerID="abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.487754 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nw6v\" (UniqueName: \"kubernetes.io/projected/74481165-f298-4cdc-9af7-feef043fa182-kube-api-access-4nw6v\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.487777 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:23 crc kubenswrapper[4869]: E0312 14:50:23.488383 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c\": container with ID starting with abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c not found: ID does not exist" containerID="abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.488430 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c"} err="failed to get container status \"abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c\": rpc error: code = NotFound desc = could not find container \"abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c\": container with ID starting with abb9213ab794ad7bd95a2c3c3205330903217c6a16a6cdf8df0dc21afd19781c not found: ID does not exist" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.488456 4869 scope.go:117] "RemoveContainer" containerID="3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340" Mar 12 14:50:23 crc kubenswrapper[4869]: E0312 14:50:23.489063 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340\": container with ID starting with 3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340 not found: ID does not exist" containerID="3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.489086 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340"} err="failed to get container status \"3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340\": rpc error: code = NotFound desc = could not find container 
\"3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340\": container with ID starting with 3c2f2705967e07c6b85029ab5f3fc5d385b807153bc7f13c8df461231acea340 not found: ID does not exist" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.489103 4869 scope.go:117] "RemoveContainer" containerID="cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d" Mar 12 14:50:23 crc kubenswrapper[4869]: E0312 14:50:23.489368 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d\": container with ID starting with cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d not found: ID does not exist" containerID="cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.489409 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d"} err="failed to get container status \"cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d\": rpc error: code = NotFound desc = could not find container \"cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d\": container with ID starting with cf8a5c712d5a0df89a1285830b7f66dba060dcbeb6f2284e3d0dcb13ce0cdb3d not found: ID does not exist" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.537358 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74481165-f298-4cdc-9af7-feef043fa182" (UID: "74481165-f298-4cdc-9af7-feef043fa182"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.591134 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74481165-f298-4cdc-9af7-feef043fa182-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.745419 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfj7c"] Mar 12 14:50:23 crc kubenswrapper[4869]: I0312 14:50:23.748640 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cfj7c"] Mar 12 14:50:24 crc kubenswrapper[4869]: I0312 14:50:24.344748 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74481165-f298-4cdc-9af7-feef043fa182" path="/var/lib/kubelet/pods/74481165-f298-4cdc-9af7-feef043fa182/volumes" Mar 12 14:50:25 crc kubenswrapper[4869]: I0312 14:50:25.875746 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" podUID="d640d651-bd78-49dc-b945-f0c749666c66" containerName="oauth-openshift" containerID="cri-o://ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57" gracePeriod=15 Mar 12 14:50:25 crc kubenswrapper[4869]: I0312 14:50:25.889342 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6x5kk" Mar 12 14:50:25 crc kubenswrapper[4869]: I0312 14:50:25.889433 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6x5kk" Mar 12 14:50:25 crc kubenswrapper[4869]: I0312 14:50:25.937412 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6x5kk" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.311744 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352505 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4c5h\" (UniqueName: \"kubernetes.io/projected/d640d651-bd78-49dc-b945-f0c749666c66-kube-api-access-x4c5h\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352643 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-audit-policies\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352674 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d640d651-bd78-49dc-b945-f0c749666c66-audit-dir\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352696 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-service-ca\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352762 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-idp-0-file-data\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352794 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-router-certs\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352841 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-trusted-ca-bundle\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352888 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-ocp-branding-template\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352936 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-login\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.352959 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-session\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.353023 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-error\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.353077 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-cliconfig\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.353118 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-serving-cert\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.353165 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-provider-selection\") pod \"d640d651-bd78-49dc-b945-f0c749666c66\" (UID: \"d640d651-bd78-49dc-b945-f0c749666c66\") " Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.355292 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.366104 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d640d651-bd78-49dc-b945-f0c749666c66-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.366753 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.367147 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.367169 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.370469 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.370624 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.370931 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.371536 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.371769 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.372123 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.372868 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d640d651-bd78-49dc-b945-f0c749666c66-kube-api-access-x4c5h" (OuterVolumeSpecName: "kube-api-access-x4c5h") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "kube-api-access-x4c5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.373318 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.391709 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d640d651-bd78-49dc-b945-f0c749666c66" (UID: "d640d651-bd78-49dc-b945-f0c749666c66"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.440265 4869 generic.go:334] "Generic (PLEG): container finished" podID="d640d651-bd78-49dc-b945-f0c749666c66" containerID="ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57" exitCode=0 Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.440360 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" event={"ID":"d640d651-bd78-49dc-b945-f0c749666c66","Type":"ContainerDied","Data":"ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57"} Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.440422 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" event={"ID":"d640d651-bd78-49dc-b945-f0c749666c66","Type":"ContainerDied","Data":"17d4e6358251c9e9f4695fc7bf5401464ccc338c55529f48cc85e98f8d00e189"} Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.440444 4869 scope.go:117] "RemoveContainer" containerID="ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.441128 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fdwk9" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454578 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454606 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454616 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454624 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454635 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454644 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454655 4869 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454669 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4c5h\" (UniqueName: \"kubernetes.io/projected/d640d651-bd78-49dc-b945-f0c749666c66-kube-api-access-x4c5h\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454682 4869 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454693 4869 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d640d651-bd78-49dc-b945-f0c749666c66-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454704 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454738 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454748 4869 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.454757 4869 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d640d651-bd78-49dc-b945-f0c749666c66-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.460536 4869 scope.go:117] "RemoveContainer" containerID="ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57" Mar 12 14:50:26 crc kubenswrapper[4869]: E0312 14:50:26.460987 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57\": container with ID starting with ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57 not found: ID does not exist" containerID="ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.461016 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57"} err="failed to get container status \"ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57\": rpc error: code = NotFound desc = could not find container \"ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57\": container with ID starting with ef1e4918cba5339ca58ec7ddc01989527447fcc18a406f5b98ae1f11a77ddb57 not found: ID does not exist" Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.476836 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fdwk9"] Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.480270 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fdwk9"] Mar 12 14:50:26 crc kubenswrapper[4869]: I0312 14:50:26.482775 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-6x5kk" Mar 12 14:50:27 crc kubenswrapper[4869]: I0312 14:50:27.317058 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6x5kk"] Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.344982 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d640d651-bd78-49dc-b945-f0c749666c66" path="/var/lib/kubelet/pods/d640d651-bd78-49dc-b945-f0c749666c66/volumes" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.453525 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6x5kk" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="registry-server" containerID="cri-o://c25c773e978f8ff56001b54b577884aaedf0678d86137cc5c068fc32de36da63" gracePeriod=2 Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.905144 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-574dcf5686-hgfdl"] Mar 12 14:50:28 crc kubenswrapper[4869]: E0312 14:50:28.905722 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="extract-utilities" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.905738 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="extract-utilities" Mar 12 14:50:28 crc kubenswrapper[4869]: E0312 14:50:28.905762 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d640d651-bd78-49dc-b945-f0c749666c66" containerName="oauth-openshift" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.905770 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d640d651-bd78-49dc-b945-f0c749666c66" containerName="oauth-openshift" Mar 12 14:50:28 crc kubenswrapper[4869]: E0312 14:50:28.905792 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74481165-f298-4cdc-9af7-feef043fa182" 
containerName="registry-server" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.905802 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="registry-server" Mar 12 14:50:28 crc kubenswrapper[4869]: E0312 14:50:28.905820 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="extract-content" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.905828 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="extract-content" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.906078 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="74481165-f298-4cdc-9af7-feef043fa182" containerName="registry-server" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.906096 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d640d651-bd78-49dc-b945-f0c749666c66" containerName="oauth-openshift" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.906842 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.913347 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.918023 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.919084 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.922687 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-574dcf5686-hgfdl"] Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.927284 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.927941 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.928012 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.928018 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.928229 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.928714 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 14:50:28 crc 
kubenswrapper[4869]: I0312 14:50:28.929109 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.929336 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.929403 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.936609 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.946981 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.951687 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985212 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985287 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-serving-cert\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: 
\"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985339 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-audit-policies\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985358 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-service-ca\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985381 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-error\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985407 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985426 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985638 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985706 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npcg\" (UniqueName: \"kubernetes.io/projected/69756115-308a-4c29-81bb-e91f54348383-kube-api-access-9npcg\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985754 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-cliconfig\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985794 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/69756115-308a-4c29-81bb-e91f54348383-audit-dir\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985826 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-login\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985900 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-router-certs\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:28 crc kubenswrapper[4869]: I0312 14:50:28.985942 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-session\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087402 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " 
pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087458 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087555 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087590 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npcg\" (UniqueName: \"kubernetes.io/projected/69756115-308a-4c29-81bb-e91f54348383-kube-api-access-9npcg\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087621 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-cliconfig\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087647 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69756115-308a-4c29-81bb-e91f54348383-audit-dir\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087671 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-login\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087693 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-router-certs\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087713 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-session\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087741 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " 
pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087762 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-serving-cert\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087781 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-audit-policies\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087795 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-service-ca\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.087814 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-error\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.088620 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/69756115-308a-4c29-81bb-e91f54348383-audit-dir\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.088964 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-audit-policies\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.089438 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-service-ca\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.089621 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.090432 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-cliconfig\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 
14:50:29.094305 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.094306 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-error\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.094309 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-router-certs\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.094401 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-session\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.094659 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-template-login\") pod 
\"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.094885 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.095203 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.101846 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/69756115-308a-4c29-81bb-e91f54348383-v4-0-config-system-serving-cert\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.105374 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npcg\" (UniqueName: \"kubernetes.io/projected/69756115-308a-4c29-81bb-e91f54348383-kube-api-access-9npcg\") pod \"oauth-openshift-574dcf5686-hgfdl\" (UID: \"69756115-308a-4c29-81bb-e91f54348383\") " pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.234026 4869 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.463979 4869 generic.go:334] "Generic (PLEG): container finished" podID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerID="c25c773e978f8ff56001b54b577884aaedf0678d86137cc5c068fc32de36da63" exitCode=0 Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.464408 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x5kk" event={"ID":"82120625-a0ba-4136-b3d5-23b2b78f72cb","Type":"ContainerDied","Data":"c25c773e978f8ff56001b54b577884aaedf0678d86137cc5c068fc32de36da63"} Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.478729 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6x5kk" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.497980 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-catalog-content\") pod \"82120625-a0ba-4136-b3d5-23b2b78f72cb\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.498054 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzvvw\" (UniqueName: \"kubernetes.io/projected/82120625-a0ba-4136-b3d5-23b2b78f72cb-kube-api-access-mzvvw\") pod \"82120625-a0ba-4136-b3d5-23b2b78f72cb\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.498102 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-utilities\") pod \"82120625-a0ba-4136-b3d5-23b2b78f72cb\" (UID: \"82120625-a0ba-4136-b3d5-23b2b78f72cb\") " Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 
14:50:29.499097 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-utilities" (OuterVolumeSpecName: "utilities") pod "82120625-a0ba-4136-b3d5-23b2b78f72cb" (UID: "82120625-a0ba-4136-b3d5-23b2b78f72cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.511725 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82120625-a0ba-4136-b3d5-23b2b78f72cb-kube-api-access-mzvvw" (OuterVolumeSpecName: "kube-api-access-mzvvw") pod "82120625-a0ba-4136-b3d5-23b2b78f72cb" (UID: "82120625-a0ba-4136-b3d5-23b2b78f72cb"). InnerVolumeSpecName "kube-api-access-mzvvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.557976 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82120625-a0ba-4136-b3d5-23b2b78f72cb" (UID: "82120625-a0ba-4136-b3d5-23b2b78f72cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.599397 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.599433 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82120625-a0ba-4136-b3d5-23b2b78f72cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.599445 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzvvw\" (UniqueName: \"kubernetes.io/projected/82120625-a0ba-4136-b3d5-23b2b78f72cb-kube-api-access-mzvvw\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:29 crc kubenswrapper[4869]: I0312 14:50:29.721244 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-574dcf5686-hgfdl"] Mar 12 14:50:29 crc kubenswrapper[4869]: W0312 14:50:29.731594 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69756115_308a_4c29_81bb_e91f54348383.slice/crio-c43f9e6ea4cde118360c125bdbd318209e4700c06b3624c5278e86c5f31a51c8 WatchSource:0}: Error finding container c43f9e6ea4cde118360c125bdbd318209e4700c06b3624c5278e86c5f31a51c8: Status 404 returned error can't find the container with id c43f9e6ea4cde118360c125bdbd318209e4700c06b3624c5278e86c5f31a51c8 Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.474513 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" event={"ID":"69756115-308a-4c29-81bb-e91f54348383","Type":"ContainerStarted","Data":"fcdc874a2aaff0f7ab31f697b7de4b38e4f981fcf51d6e678ef9fc6cc6cf1fab"} Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.474887 4869 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.474902 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" event={"ID":"69756115-308a-4c29-81bb-e91f54348383","Type":"ContainerStarted","Data":"c43f9e6ea4cde118360c125bdbd318209e4700c06b3624c5278e86c5f31a51c8"} Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.481711 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6x5kk" event={"ID":"82120625-a0ba-4136-b3d5-23b2b78f72cb","Type":"ContainerDied","Data":"26bda690ce8f68649aef318679e3249a54a428a276826823421e45f3ce3bceda"} Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.481828 4869 scope.go:117] "RemoveContainer" containerID="c25c773e978f8ff56001b54b577884aaedf0678d86137cc5c068fc32de36da63" Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.481745 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6x5kk" Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.486208 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.508439 4869 scope.go:117] "RemoveContainer" containerID="aab48e4b0b825c0047d0bf8b3aac6f3b96207b21f69f699241195797185f37a1" Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.514993 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-574dcf5686-hgfdl" podStartSLOduration=30.514954304 podStartE2EDuration="30.514954304s" podCreationTimestamp="2026-03-12 14:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:30.501743743 +0000 UTC m=+182.786969041" watchObservedRunningTime="2026-03-12 14:50:30.514954304 +0000 UTC m=+182.800179622" Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.528043 4869 scope.go:117] "RemoveContainer" containerID="ae27ac59a1246817e0f19e4a6992dcd7b2e8521ae0c9fcbd802c66006aaf00eb" Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.529083 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6x5kk"] Mar 12 14:50:30 crc kubenswrapper[4869]: I0312 14:50:30.532821 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6x5kk"] Mar 12 14:50:32 crc kubenswrapper[4869]: I0312 14:50:32.344219 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" path="/var/lib/kubelet/pods/82120625-a0ba-4136-b3d5-23b2b78f72cb/volumes" Mar 12 14:50:40 crc kubenswrapper[4869]: I0312 14:50:40.431051 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-665bf8c958-gqpvl"] Mar 12 14:50:40 crc kubenswrapper[4869]: I0312 14:50:40.431908 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" podUID="1f711c19-4a9c-4fc6-b728-b0aa0749597b" containerName="controller-manager" containerID="cri-o://f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448" gracePeriod=30 Mar 12 14:50:40 crc kubenswrapper[4869]: I0312 14:50:40.532817 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7"] Mar 12 14:50:40 crc kubenswrapper[4869]: I0312 14:50:40.533069 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" podUID="d1b2ae3b-cb2f-4625-b30d-c92c31084966" containerName="route-controller-manager" containerID="cri-o://f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69" gracePeriod=30 Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.270248 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.330255 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc"] Mar 12 14:50:42 crc kubenswrapper[4869]: E0312 14:50:42.330580 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b2ae3b-cb2f-4625-b30d-c92c31084966" containerName="route-controller-manager" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.330605 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b2ae3b-cb2f-4625-b30d-c92c31084966" containerName="route-controller-manager" Mar 12 14:50:42 crc kubenswrapper[4869]: E0312 14:50:42.330628 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="registry-server" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.330637 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="registry-server" Mar 12 14:50:42 crc kubenswrapper[4869]: E0312 14:50:42.330655 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="extract-utilities" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.330666 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="extract-utilities" Mar 12 14:50:42 crc kubenswrapper[4869]: E0312 14:50:42.330680 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="extract-content" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.330687 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="extract-content" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.330830 4869 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b2ae3b-cb2f-4625-b30d-c92c31084966" containerName="route-controller-manager" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.330848 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="82120625-a0ba-4136-b3d5-23b2b78f72cb" containerName="registry-server" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.331463 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.347907 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc"] Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.384267 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-client-ca\") pod \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.384488 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b2ae3b-cb2f-4625-b30d-c92c31084966-serving-cert\") pod \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.385309 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-config\") pod \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.385382 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6f4g\" (UniqueName: 
\"kubernetes.io/projected/d1b2ae3b-cb2f-4625-b30d-c92c31084966-kube-api-access-d6f4g\") pod \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\" (UID: \"d1b2ae3b-cb2f-4625-b30d-c92c31084966\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.385489 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-client-ca\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.385657 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r95fg\" (UniqueName: \"kubernetes.io/projected/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-kube-api-access-r95fg\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.385698 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-serving-cert\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.385818 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-config\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 
14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.387305 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-config" (OuterVolumeSpecName: "config") pod "d1b2ae3b-cb2f-4625-b30d-c92c31084966" (UID: "d1b2ae3b-cb2f-4625-b30d-c92c31084966"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.389026 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-client-ca" (OuterVolumeSpecName: "client-ca") pod "d1b2ae3b-cb2f-4625-b30d-c92c31084966" (UID: "d1b2ae3b-cb2f-4625-b30d-c92c31084966"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.399664 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b2ae3b-cb2f-4625-b30d-c92c31084966-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d1b2ae3b-cb2f-4625-b30d-c92c31084966" (UID: "d1b2ae3b-cb2f-4625-b30d-c92c31084966"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.399811 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b2ae3b-cb2f-4625-b30d-c92c31084966-kube-api-access-d6f4g" (OuterVolumeSpecName: "kube-api-access-d6f4g") pod "d1b2ae3b-cb2f-4625-b30d-c92c31084966" (UID: "d1b2ae3b-cb2f-4625-b30d-c92c31084966"). InnerVolumeSpecName "kube-api-access-d6f4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.455581 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.486763 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-serving-cert\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.486881 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-config\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.486928 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-client-ca\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.487003 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r95fg\" (UniqueName: \"kubernetes.io/projected/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-kube-api-access-r95fg\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.487052 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d1b2ae3b-cb2f-4625-b30d-c92c31084966-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.487067 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.487080 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6f4g\" (UniqueName: \"kubernetes.io/projected/d1b2ae3b-cb2f-4625-b30d-c92c31084966-kube-api-access-d6f4g\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.487094 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1b2ae3b-cb2f-4625-b30d-c92c31084966-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.489967 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-config\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.491223 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-client-ca\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.492029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-serving-cert\") pod 
\"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.504912 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r95fg\" (UniqueName: \"kubernetes.io/projected/ae3e152e-5f0c-47c2-85da-ccad0eff9e62-kube-api-access-r95fg\") pod \"route-controller-manager-c86497d49-fw2dc\" (UID: \"ae3e152e-5f0c-47c2-85da-ccad0eff9e62\") " pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.587906 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkjhf\" (UniqueName: \"kubernetes.io/projected/1f711c19-4a9c-4fc6-b728-b0aa0749597b-kube-api-access-kkjhf\") pod \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.588042 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-config\") pod \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.588108 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-proxy-ca-bundles\") pod \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.588140 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f711c19-4a9c-4fc6-b728-b0aa0749597b-serving-cert\") pod \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\" (UID: 
\"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.588207 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-client-ca\") pod \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\" (UID: \"1f711c19-4a9c-4fc6-b728-b0aa0749597b\") " Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.588801 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1f711c19-4a9c-4fc6-b728-b0aa0749597b" (UID: "1f711c19-4a9c-4fc6-b728-b0aa0749597b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.588845 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-config" (OuterVolumeSpecName: "config") pod "1f711c19-4a9c-4fc6-b728-b0aa0749597b" (UID: "1f711c19-4a9c-4fc6-b728-b0aa0749597b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.589080 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-client-ca" (OuterVolumeSpecName: "client-ca") pod "1f711c19-4a9c-4fc6-b728-b0aa0749597b" (UID: "1f711c19-4a9c-4fc6-b728-b0aa0749597b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.590603 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f711c19-4a9c-4fc6-b728-b0aa0749597b-kube-api-access-kkjhf" (OuterVolumeSpecName: "kube-api-access-kkjhf") pod "1f711c19-4a9c-4fc6-b728-b0aa0749597b" (UID: "1f711c19-4a9c-4fc6-b728-b0aa0749597b"). InnerVolumeSpecName "kube-api-access-kkjhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.591109 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f711c19-4a9c-4fc6-b728-b0aa0749597b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1f711c19-4a9c-4fc6-b728-b0aa0749597b" (UID: "1f711c19-4a9c-4fc6-b728-b0aa0749597b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.652115 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.691502 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkjhf\" (UniqueName: \"kubernetes.io/projected/1f711c19-4a9c-4fc6-b728-b0aa0749597b-kube-api-access-kkjhf\") on node \"crc\" DevicePath \"\""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.691965 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-config\") on node \"crc\" DevicePath \"\""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.691981 4869 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.691995 4869 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f711c19-4a9c-4fc6-b728-b0aa0749597b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.692008 4869 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f711c19-4a9c-4fc6-b728-b0aa0749597b-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.693883 4869 generic.go:334] "Generic (PLEG): container finished" podID="d1b2ae3b-cb2f-4625-b30d-c92c31084966" containerID="f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69" exitCode=0
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.693972 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" event={"ID":"d1b2ae3b-cb2f-4625-b30d-c92c31084966","Type":"ContainerDied","Data":"f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69"}
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.694006 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" event={"ID":"d1b2ae3b-cb2f-4625-b30d-c92c31084966","Type":"ContainerDied","Data":"d9732cbe6bc657d3b9bb6004b754988588e2d50d55714770907349b4e7b42307"}
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.694028 4869 scope.go:117] "RemoveContainer" containerID="f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.694034 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.695955 4869 generic.go:334] "Generic (PLEG): container finished" podID="1f711c19-4a9c-4fc6-b728-b0aa0749597b" containerID="f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448" exitCode=0
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.696001 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" event={"ID":"1f711c19-4a9c-4fc6-b728-b0aa0749597b","Type":"ContainerDied","Data":"f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448"}
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.696035 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" event={"ID":"1f711c19-4a9c-4fc6-b728-b0aa0749597b","Type":"ContainerDied","Data":"768a872213ef60ef8df10461d0a83ec9a99a7e46f0b97b17f50cbc23a07bd00e"}
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.696081 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.710178 4869 scope.go:117] "RemoveContainer" containerID="f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69"
Mar 12 14:50:42 crc kubenswrapper[4869]: E0312 14:50:42.710573 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69\": container with ID starting with f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69 not found: ID does not exist" containerID="f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.710619 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69"} err="failed to get container status \"f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69\": rpc error: code = NotFound desc = could not find container \"f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69\": container with ID starting with f88ea4c916b1b58abcdc02b48f1019fe7ab7f2707bb463db284bd9d934ce8d69 not found: ID does not exist"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.710643 4869 scope.go:117] "RemoveContainer" containerID="f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.740590 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-665bf8c958-gqpvl"]
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.745855 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-665bf8c958-gqpvl"]
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.751418 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7"]
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.751646 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7"]
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.785213 4869 scope.go:117] "RemoveContainer" containerID="f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448"
Mar 12 14:50:42 crc kubenswrapper[4869]: E0312 14:50:42.788185 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448\": container with ID starting with f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448 not found: ID does not exist" containerID="f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448"
Mar 12 14:50:42 crc kubenswrapper[4869]: I0312 14:50:42.788221 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448"} err="failed to get container status \"f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448\": rpc error: code = NotFound desc = could not find container \"f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448\": container with ID starting with f673f36a6fbd2ea5690c865f6a86e78170d49d0606527c7d08ef9c92d1441448 not found: ID does not exist"
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.054247 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc"]
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.223497 4869 patch_prober.go:28] interesting pod/route-controller-manager-dbd769694-l5ft7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.224474 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-dbd769694-l5ft7" podUID="d1b2ae3b-cb2f-4625-b30d-c92c31084966" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.237038 4869 patch_prober.go:28] interesting pod/controller-manager-665bf8c958-gqpvl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.237138 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-665bf8c958-gqpvl" podUID="1f711c19-4a9c-4fc6-b728-b0aa0749597b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.700649 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" event={"ID":"ae3e152e-5f0c-47c2-85da-ccad0eff9e62","Type":"ContainerStarted","Data":"3cd1d10d4c998c2793665cbceedf160759be31ed0a20b028081d557845e7c77b"}
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.700704 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" event={"ID":"ae3e152e-5f0c-47c2-85da-ccad0eff9e62","Type":"ContainerStarted","Data":"cc17e6ddfa486a9bc845cd6d67eb77c5f5248e151103fee446828c8bab9d0f26"}
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.700855 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc"
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.706494 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc"
Mar 12 14:50:43 crc kubenswrapper[4869]: I0312 14:50:43.719715 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c86497d49-fw2dc" podStartSLOduration=3.719695872 podStartE2EDuration="3.719695872s" podCreationTimestamp="2026-03-12 14:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:43.719439735 +0000 UTC m=+196.004665013" watchObservedRunningTime="2026-03-12 14:50:43.719695872 +0000 UTC m=+196.004921150"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.345131 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f711c19-4a9c-4fc6-b728-b0aa0749597b" path="/var/lib/kubelet/pods/1f711c19-4a9c-4fc6-b728-b0aa0749597b/volumes"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.346222 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b2ae3b-cb2f-4625-b30d-c92c31084966" path="/var/lib/kubelet/pods/d1b2ae3b-cb2f-4625-b30d-c92c31084966/volumes"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.911187 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"]
Mar 12 14:50:44 crc kubenswrapper[4869]: E0312 14:50:44.921419 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f711c19-4a9c-4fc6-b728-b0aa0749597b" containerName="controller-manager"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.921507 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f711c19-4a9c-4fc6-b728-b0aa0749597b" containerName="controller-manager"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.921936 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f711c19-4a9c-4fc6-b728-b0aa0749597b" containerName="controller-manager"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.924311 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.929154 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.929348 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.930165 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.930389 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.931835 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.935122 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.939158 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"]
Mar 12 14:50:44 crc kubenswrapper[4869]: I0312 14:50:44.939758 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.123365 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-config\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.123433 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c3db8a-4507-4bf5-ba61-0343b4b25044-serving-cert\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.123515 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mfmm\" (UniqueName: \"kubernetes.io/projected/95c3db8a-4507-4bf5-ba61-0343b4b25044-kube-api-access-9mfmm\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.123563 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-client-ca\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.123593 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-proxy-ca-bundles\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.224734 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mfmm\" (UniqueName: \"kubernetes.io/projected/95c3db8a-4507-4bf5-ba61-0343b4b25044-kube-api-access-9mfmm\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.225363 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-client-ca\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.225676 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-proxy-ca-bundles\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.225920 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-config\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.226238 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c3db8a-4507-4bf5-ba61-0343b4b25044-serving-cert\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.226395 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-client-ca\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.228029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-proxy-ca-bundles\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.229415 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95c3db8a-4507-4bf5-ba61-0343b4b25044-config\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.236222 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95c3db8a-4507-4bf5-ba61-0343b4b25044-serving-cert\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.242332 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mfmm\" (UniqueName: \"kubernetes.io/projected/95c3db8a-4507-4bf5-ba61-0343b4b25044-kube-api-access-9mfmm\") pod \"controller-manager-86ddfd7495-9kbnc\" (UID: \"95c3db8a-4507-4bf5-ba61-0343b4b25044\") " pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.275929 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.490983 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"]
Mar 12 14:50:45 crc kubenswrapper[4869]: W0312 14:50:45.494872 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c3db8a_4507_4bf5_ba61_0343b4b25044.slice/crio-bd8e201849aa4f014f55979e521d811d63daa8d7b7d81853562963371a423098 WatchSource:0}: Error finding container bd8e201849aa4f014f55979e521d811d63daa8d7b7d81853562963371a423098: Status 404 returned error can't find the container with id bd8e201849aa4f014f55979e521d811d63daa8d7b7d81853562963371a423098
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.718142 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc" event={"ID":"95c3db8a-4507-4bf5-ba61-0343b4b25044","Type":"ContainerStarted","Data":"61fce96774d651c28197efd935834b4157f950353ddf5bea84430bb3acb8d8d4"}
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.718555 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc" event={"ID":"95c3db8a-4507-4bf5-ba61-0343b4b25044","Type":"ContainerStarted","Data":"bd8e201849aa4f014f55979e521d811d63daa8d7b7d81853562963371a423098"}
Mar 12 14:50:45 crc kubenswrapper[4869]: I0312 14:50:45.733707 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc" podStartSLOduration=5.733687106 podStartE2EDuration="5.733687106s" podCreationTimestamp="2026-03-12 14:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:50:45.732309885 +0000 UTC m=+198.017535163" watchObservedRunningTime="2026-03-12 14:50:45.733687106 +0000 UTC m=+198.018912394"
Mar 12 14:50:46 crc kubenswrapper[4869]: I0312 14:50:46.723477 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:46 crc kubenswrapper[4869]: I0312 14:50:46.727578 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86ddfd7495-9kbnc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.882951 4869 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.884523 4869 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.884718 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.884990 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8" gracePeriod=15
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.885124 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72" gracePeriod=15
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.885144 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e" gracePeriod=15
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.885111 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9" gracePeriod=15
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.885217 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6" gracePeriod=15
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886031 4869 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886348 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886373 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886396 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886411 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886428 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886445 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886464 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886479 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886498 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886513 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886567 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886586 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886608 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886623 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.886649 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886666 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886910 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886935 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886954 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886972 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.886989 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.887016 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.887035 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.887255 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.887273 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: E0312 14:50:54.887295 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.887310 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.887520 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.887572 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.945226 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952356 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952436 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952465 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952486 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952510 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952534 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952589 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:54 crc kubenswrapper[4869]: I0312 14:50:54.952612 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053512 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053612 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053660 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053710 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053698 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053737 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053763 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053789 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053798 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053841 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053883 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") "
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.053920 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.054016 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.054040 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.054042 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.241631 4869 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 14:50:55 crc kubenswrapper[4869]: E0312 14:50:55.279938 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c1f8fb01c4dfa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:50:55.27256217 +0000 UTC m=+207.557787468,LastTimestamp:2026-03-12 14:50:55.27256217 +0000 UTC m=+207.557787468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.511147 4869 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.511219 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.775769 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.778499 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.779525 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6" exitCode=0 Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.779594 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9" exitCode=0 Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.779605 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e" exitCode=0 Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.779617 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72" exitCode=2 Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.779641 4869 scope.go:117] "RemoveContainer" containerID="0e6104f6e86200fc4f007b43b7b8c0c0dfb0cf70075ca81ff9773e4424d03e28" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.782420 4869 generic.go:334] "Generic (PLEG): container finished" podID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" 
containerID="ffe530aa4230f2adc429700dea2edfed048b5e43bc5e2268dee47c2440632fc7" exitCode=0 Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.782554 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2","Type":"ContainerDied","Data":"ffe530aa4230f2adc429700dea2edfed048b5e43bc5e2268dee47c2440632fc7"} Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.783263 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.783795 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.784137 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.784891 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2f86ed7d93b8f1188ec7ba7486307a358692889407afcf769d87d8a700fa1d6f"} Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.784923 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2cd87614f8c0b429e12ca6af795869c78fae19178379ea36db663b49089c25b3"} Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.785725 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.785921 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:55 crc kubenswrapper[4869]: I0312 14:50:55.786113 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:56 crc kubenswrapper[4869]: I0312 14:50:56.795610 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.280480 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.281732 4869 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.282432 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.282932 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.283307 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.285052 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.285354 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.285618 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.285971 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.392800 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.392877 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.392896 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.392946 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kube-api-access\") pod \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.392965 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-var-lock\") pod \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.392978 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393035 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kubelet-dir\") pod \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\" (UID: \"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2\") " Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393033 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393072 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393171 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" (UID: "fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393145 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-var-lock" (OuterVolumeSpecName: "var-lock") pod "fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" (UID: "fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393492 4869 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393513 4869 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393528 4869 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393559 4869 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.393571 4869 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.399209 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kube-api-access" 
(OuterVolumeSpecName: "kube-api-access") pod "fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" (UID: "fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.494715 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.805978 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.807200 4869 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8" exitCode=0 Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.807270 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.807314 4869 scope.go:117] "RemoveContainer" containerID="1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.811228 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2","Type":"ContainerDied","Data":"f27749ee1196b7f5a52c3909687a81dda32434db6622b688576121ca82014312"} Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.811356 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27749ee1196b7f5a52c3909687a81dda32434db6622b688576121ca82014312" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.811503 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.823986 4869 scope.go:117] "RemoveContainer" containerID="ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.827374 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.827782 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: 
I0312 14:50:57.828030 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.830512 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.830796 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.831091 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.841665 4869 scope.go:117] "RemoveContainer" containerID="f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.856164 4869 scope.go:117] "RemoveContainer" containerID="518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.868844 4869 scope.go:117] "RemoveContainer" 
containerID="2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.885704 4869 scope.go:117] "RemoveContainer" containerID="6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.903436 4869 scope.go:117] "RemoveContainer" containerID="1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6" Mar 12 14:50:57 crc kubenswrapper[4869]: E0312 14:50:57.904176 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6\": container with ID starting with 1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6 not found: ID does not exist" containerID="1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.904239 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6"} err="failed to get container status \"1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6\": rpc error: code = NotFound desc = could not find container \"1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6\": container with ID starting with 1e74d55d3f437aafbce19df9d38a30003f8c5668956c1cd61f7792e1747c4ed6 not found: ID does not exist" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.904287 4869 scope.go:117] "RemoveContainer" containerID="ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9" Mar 12 14:50:57 crc kubenswrapper[4869]: E0312 14:50:57.905165 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\": container with ID starting with 
ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9 not found: ID does not exist" containerID="ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.905208 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9"} err="failed to get container status \"ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\": rpc error: code = NotFound desc = could not find container \"ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9\": container with ID starting with ade0e5dee1a3862128dbe2599fff70920f8af3d833f28981be57d7a67e0003c9 not found: ID does not exist" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.905248 4869 scope.go:117] "RemoveContainer" containerID="f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e" Mar 12 14:50:57 crc kubenswrapper[4869]: E0312 14:50:57.906880 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\": container with ID starting with f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e not found: ID does not exist" containerID="f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.906918 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e"} err="failed to get container status \"f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\": rpc error: code = NotFound desc = could not find container \"f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e\": container with ID starting with f7f2d0e37f450282dca2d3e08b24868d3862fcb72eca6f7a6a9aca2d13015f4e not found: ID does not 
exist" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.906940 4869 scope.go:117] "RemoveContainer" containerID="518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72" Mar 12 14:50:57 crc kubenswrapper[4869]: E0312 14:50:57.907381 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\": container with ID starting with 518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72 not found: ID does not exist" containerID="518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.907415 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72"} err="failed to get container status \"518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\": rpc error: code = NotFound desc = could not find container \"518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72\": container with ID starting with 518439416ac86b6aa626cb78deb23ea94de1043a09abb2dc0ae51ea876199b72 not found: ID does not exist" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.907436 4869 scope.go:117] "RemoveContainer" containerID="2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8" Mar 12 14:50:57 crc kubenswrapper[4869]: E0312 14:50:57.907798 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\": container with ID starting with 2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8 not found: ID does not exist" containerID="2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.907836 4869 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8"} err="failed to get container status \"2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\": rpc error: code = NotFound desc = could not find container \"2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8\": container with ID starting with 2be3fc94063ff64e8420a185bf53159150916de6355f8bb3ca727c6def21b5e8 not found: ID does not exist" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.907851 4869 scope.go:117] "RemoveContainer" containerID="6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3" Mar 12 14:50:57 crc kubenswrapper[4869]: E0312 14:50:57.908471 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\": container with ID starting with 6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3 not found: ID does not exist" containerID="6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3" Mar 12 14:50:57 crc kubenswrapper[4869]: I0312 14:50:57.908497 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3"} err="failed to get container status \"6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\": rpc error: code = NotFound desc = could not find container \"6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3\": container with ID starting with 6ee40a3b1e31feae95747d08364edf9c9995181354849d3d3e55f2255664d4c3 not found: ID does not exist" Mar 12 14:50:58 crc kubenswrapper[4869]: I0312 14:50:58.340355 4869 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:58 crc kubenswrapper[4869]: I0312 14:50:58.341332 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:58 crc kubenswrapper[4869]: I0312 14:50:58.341784 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:50:58 crc kubenswrapper[4869]: I0312 14:50:58.351358 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 12 14:51:00 crc kubenswrapper[4869]: E0312 14:51:00.097114 4869 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c1f8fb01c4dfa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 14:50:55.27256217 +0000 UTC m=+207.557787468,LastTimestamp:2026-03-12 14:50:55.27256217 +0000 UTC m=+207.557787468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 14:51:01 crc kubenswrapper[4869]: E0312 14:51:01.946459 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:01 crc kubenswrapper[4869]: E0312 14:51:01.946881 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:01 crc kubenswrapper[4869]: E0312 14:51:01.947388 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:01 crc kubenswrapper[4869]: E0312 14:51:01.947647 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:01 crc kubenswrapper[4869]: E0312 14:51:01.947911 4869 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:01 crc 
kubenswrapper[4869]: I0312 14:51:01.947943 4869 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 12 14:51:01 crc kubenswrapper[4869]: E0312 14:51:01.948587 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Mar 12 14:51:02 crc kubenswrapper[4869]: E0312 14:51:02.149522 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Mar 12 14:51:02 crc kubenswrapper[4869]: E0312 14:51:02.551436 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Mar 12 14:51:03 crc kubenswrapper[4869]: E0312 14:51:03.352474 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Mar 12 14:51:04 crc kubenswrapper[4869]: E0312 14:51:04.954956 4869 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.336430 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.338763 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.339234 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.360951 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.361007 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:06 crc kubenswrapper[4869]: E0312 14:51:06.361798 4869 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.362854 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.884699 4869 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="5e601e88c798c312c8a30c151c4883ad1ff23e245fe5da5c2d8cb74baafe616f" exitCode=0 Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.884846 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"5e601e88c798c312c8a30c151c4883ad1ff23e245fe5da5c2d8cb74baafe616f"} Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.885354 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"92b4218ce188cd7f6a4a848cc2e80a8b616561ea12b9865ea408ee3387586b7f"} Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.885798 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.885824 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:06 crc kubenswrapper[4869]: E0312 14:51:06.886977 4869 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.887008 4869 status_manager.go:851] "Failed to get status for pod" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:06 crc kubenswrapper[4869]: I0312 14:51:06.888167 4869 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 14:51:07 crc kubenswrapper[4869]: I0312 14:51:07.895183 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ac5d456c86eaf32cab873c01f55513994f74cdc038a0cfd8f6610ac5ae87b3f8"} Mar 12 14:51:07 crc kubenswrapper[4869]: I0312 14:51:07.895598 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"26b6d1e5c89a6491f2178724f83e80a59587d74ccb2004d03319f8857c046dab"} Mar 12 14:51:07 crc kubenswrapper[4869]: I0312 14:51:07.895614 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b2988ae904e233ba7da58f66f2045ed8ac1bb103ebd6b1eb12139c797481e297"} Mar 12 14:51:07 crc kubenswrapper[4869]: I0312 14:51:07.895625 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"19cb718bb52d74cbe4420aa176391fb1edf55b01c5b81b6ae35aa6cd525119a1"} Mar 12 14:51:08 crc kubenswrapper[4869]: I0312 14:51:08.902614 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c42beba73d9d8131b983e27372695359be522f41d00be33a34a7f12152c6acb9"} Mar 12 14:51:08 crc kubenswrapper[4869]: I0312 14:51:08.902903 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:08 crc kubenswrapper[4869]: I0312 14:51:08.902839 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:08 crc kubenswrapper[4869]: I0312 14:51:08.902920 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:10 crc kubenswrapper[4869]: I0312 14:51:10.650722 4869 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 12 14:51:10 crc kubenswrapper[4869]: I0312 14:51:10.651115 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 12 14:51:10 crc kubenswrapper[4869]: I0312 14:51:10.921598 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 14:51:10 crc kubenswrapper[4869]: I0312 14:51:10.922285 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 14:51:10 crc kubenswrapper[4869]: I0312 14:51:10.922341 4869 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4ee62258679276360176bb5fc50c9215884f5d5e9a291139f21b5e0216f5e6f1" exitCode=1 Mar 12 14:51:10 crc kubenswrapper[4869]: I0312 14:51:10.922374 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4ee62258679276360176bb5fc50c9215884f5d5e9a291139f21b5e0216f5e6f1"} Mar 12 14:51:10 crc kubenswrapper[4869]: I0312 14:51:10.922847 4869 scope.go:117] "RemoveContainer" containerID="4ee62258679276360176bb5fc50c9215884f5d5e9a291139f21b5e0216f5e6f1" Mar 12 14:51:11 crc kubenswrapper[4869]: I0312 14:51:11.363770 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:11 crc kubenswrapper[4869]: I0312 14:51:11.363841 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:11 crc kubenswrapper[4869]: I0312 14:51:11.373528 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:11 crc kubenswrapper[4869]: I0312 14:51:11.936409 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 14:51:11 crc kubenswrapper[4869]: I0312 14:51:11.937491 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 14:51:11 crc 
kubenswrapper[4869]: I0312 14:51:11.937578 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc5c8395812a34612afa1af44d2144eb348c6f9b471c8a06649804ce1f5602e0"} Mar 12 14:51:12 crc kubenswrapper[4869]: I0312 14:51:12.737087 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:51:12 crc kubenswrapper[4869]: I0312 14:51:12.741167 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:51:12 crc kubenswrapper[4869]: I0312 14:51:12.942924 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:51:13 crc kubenswrapper[4869]: I0312 14:51:13.913507 4869 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:13 crc kubenswrapper[4869]: I0312 14:51:13.948580 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:13 crc kubenswrapper[4869]: I0312 14:51:13.948609 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:13 crc kubenswrapper[4869]: I0312 14:51:13.958059 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:13 crc kubenswrapper[4869]: I0312 14:51:13.962895 4869 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="94b684de-3447-42a8-b095-cfdc39881d53" 
Mar 12 14:51:14 crc kubenswrapper[4869]: I0312 14:51:14.953452 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:14 crc kubenswrapper[4869]: I0312 14:51:14.953482 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:18 crc kubenswrapper[4869]: I0312 14:51:18.366443 4869 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="94b684de-3447-42a8-b095-cfdc39881d53" Mar 12 14:51:19 crc kubenswrapper[4869]: I0312 14:51:19.684831 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:51:19 crc kubenswrapper[4869]: I0312 14:51:19.685162 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:51:20 crc kubenswrapper[4869]: I0312 14:51:20.133669 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 14:51:22 crc kubenswrapper[4869]: I0312 14:51:22.575161 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 14:51:22 crc kubenswrapper[4869]: I0312 14:51:22.658226 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 14:51:23 crc kubenswrapper[4869]: I0312 14:51:23.204786 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 14:51:24 crc kubenswrapper[4869]: I0312 14:51:24.058745 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 14:51:24 crc kubenswrapper[4869]: I0312 14:51:24.745046 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 14:51:25 crc kubenswrapper[4869]: I0312 14:51:25.051012 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 14:51:25 crc kubenswrapper[4869]: I0312 14:51:25.170033 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 14:51:25 crc kubenswrapper[4869]: I0312 14:51:25.245055 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 14:51:25 crc kubenswrapper[4869]: I0312 14:51:25.412299 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 14:51:26.144636 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 14:51:26.360153 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 14:51:26.433197 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 
14:51:26.592295 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 14:51:26.598006 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 14:51:26.711012 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 14:51:26.713173 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 12 14:51:26 crc kubenswrapper[4869]: I0312 14:51:26.738821 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.047373 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.114232 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.249898 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.249901 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.302958 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.390356 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 14:51:27 crc 
kubenswrapper[4869]: I0312 14:51:27.484860 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.503249 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.538092 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.602213 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.648849 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.826047 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.845188 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.870795 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 14:51:27 crc kubenswrapper[4869]: I0312 14:51:27.966785 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.246015 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.346694 4869 reflector.go:368] Caches populated for *v1.Pod from 
pkg/kubelet/config/apiserver.go:66 Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.351111 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=34.351096418 podStartE2EDuration="34.351096418s" podCreationTimestamp="2026-03-12 14:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:51:13.458093082 +0000 UTC m=+225.743318370" watchObservedRunningTime="2026-03-12 14:51:28.351096418 +0000 UTC m=+240.636321696" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.351358 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.351396 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.351697 4869 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.352235 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b890e426-900b-4755-ba40-37ed7df4521e" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.356270 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.375341 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.375316716 podStartE2EDuration="15.375316716s" podCreationTimestamp="2026-03-12 14:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 14:51:28.371075772 +0000 UTC m=+240.656301070" watchObservedRunningTime="2026-03-12 14:51:28.375316716 +0000 UTC m=+240.660541994" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.415767 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.509071 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.656085 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.706691 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.740021 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.894638 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.902185 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.910974 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.930119 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 14:51:28 crc kubenswrapper[4869]: I0312 14:51:28.942054 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.022608 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.048125 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.115678 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.182530 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.189206 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.203929 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.225124 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.240493 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.409500 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.473915 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 
14:51:29.657681 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.823286 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.843709 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.861646 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.942276 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.985150 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.987248 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 14:51:29 crc kubenswrapper[4869]: I0312 14:51:29.992691 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.003829 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.037335 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.040127 4869 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 14:51:30 crc 
kubenswrapper[4869]: I0312 14:51:30.049354 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.284296 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.467219 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.535109 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.600999 4869 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.633326 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.655476 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.687933 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.713379 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.738378 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.923486 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 14:51:30 crc kubenswrapper[4869]: I0312 14:51:30.948277 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.126642 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.166692 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.251284 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.269849 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.400100 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.411293 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.418445 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.435488 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.459401 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.505655 4869 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.574610 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.581238 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.600403 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.613115 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.817946 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.941797 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.951410 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 14:51:31 crc kubenswrapper[4869]: I0312 14:51:31.967735 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.019513 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.221227 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.319796 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.397295 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.464185 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.485320 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.485399 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.516363 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.619766 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.686365 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.691605 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.692926 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.697490 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.731507 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.805642 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.823012 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 14:51:32 crc kubenswrapper[4869]: I0312 14:51:32.920406 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.015787 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.016862 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.019179 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.096862 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.170907 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.185022 
4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.189917 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.231162 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.258381 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.315580 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.329486 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.418482 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.622523 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.628793 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.641096 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.673953 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.792198 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.811082 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.902812 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.963004 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 14:51:33 crc kubenswrapper[4869]: I0312 14:51:33.982715 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.000410 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.030000 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.031020 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.169123 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.190365 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 
14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.228669 4869 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.251154 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.267555 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.285498 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.361913 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.393410 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.437312 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.454409 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.540638 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.549842 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.643061 4869 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.677126 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.682003 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.697876 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.754299 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.811001 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.815218 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.914705 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 14:51:34 crc kubenswrapper[4869]: I0312 14:51:34.930323 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.032298 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.195864 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.238248 4869 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.239700 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.373300 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.391203 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.615954 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.740231 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.773993 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.783712 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.790906 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.815236 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.817970 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.868397 4869 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.878559 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.913563 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.920550 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 14:51:35 crc kubenswrapper[4869]: I0312 14:51:35.966027 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.052276 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.104358 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.127160 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.139794 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.145493 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.151218 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 
14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.175531 4869 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.175794 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2f86ed7d93b8f1188ec7ba7486307a358692889407afcf769d87d8a700fa1d6f" gracePeriod=5 Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.214517 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.272804 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.346467 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.389742 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.407658 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.414040 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.457187 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.517026 4869 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.573769 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.660251 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.765943 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.795746 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.809419 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.874640 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.899664 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.934861 4869 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 14:51:36 crc kubenswrapper[4869]: I0312 14:51:36.996935 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.308102 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.386872 4869 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.387412 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.500261 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.537279 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.559894 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.583847 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.785147 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.815022 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.816273 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.939731 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 14:51:37 crc kubenswrapper[4869]: I0312 14:51:37.985001 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.219810 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.383084 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.400275 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.447322 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.477657 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.480373 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.507817 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.519737 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.541531 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.627231 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 
14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.639280 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.751040 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 14:51:38 crc kubenswrapper[4869]: I0312 14:51:38.941014 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.093622 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.161845 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.164898 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.180903 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.185277 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.361284 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.505034 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.613100 4869 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.828022 4869 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 14:51:39 crc kubenswrapper[4869]: I0312 14:51:39.991276 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 14:51:40 crc kubenswrapper[4869]: I0312 14:51:40.280881 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 12 14:51:40 crc kubenswrapper[4869]: I0312 14:51:40.410618 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 14:51:40 crc kubenswrapper[4869]: I0312 14:51:40.723053 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 14:51:40 crc kubenswrapper[4869]: I0312 14:51:40.779339 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 14:51:40 crc kubenswrapper[4869]: I0312 14:51:40.785570 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 14:51:40 crc kubenswrapper[4869]: I0312 14:51:40.910850 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 12 14:51:40 crc kubenswrapper[4869]: I0312 14:51:40.929782 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.060498 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.074817 4869 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.082701 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.228923 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.228997 4869 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2f86ed7d93b8f1188ec7ba7486307a358692889407afcf769d87d8a700fa1d6f" exitCode=137 Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.243214 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.353377 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.746449 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.746897 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.793959 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794005 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794054 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794082 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794157 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794227 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: 
"var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794270 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794323 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794377 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794515 4869 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794534 4869 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.794568 4869 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.801760 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.859510 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.896202 4869 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 14:51:41 crc kubenswrapper[4869]: I0312 14:51:41.896265 4869 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.234826 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.234899 4869 scope.go:117] "RemoveContainer" containerID="2f86ed7d93b8f1188ec7ba7486307a358692889407afcf769d87d8a700fa1d6f" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.234928 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.347988 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.348279 4869 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.360596 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.360650 4869 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9d6271a2-66f0-4406-8540-a8451f8d89a2" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.363654 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.363699 4869 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9d6271a2-66f0-4406-8540-a8451f8d89a2" Mar 12 14:51:42 crc kubenswrapper[4869]: I0312 14:51:42.526076 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 14:51:43 crc kubenswrapper[4869]: I0312 14:51:43.072389 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 14:51:43 crc kubenswrapper[4869]: I0312 14:51:43.204766 4869 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 14:51:43 crc kubenswrapper[4869]: I0312 14:51:43.603018 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 14:51:44 crc kubenswrapper[4869]: I0312 14:51:44.461773 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 14:51:44 crc kubenswrapper[4869]: I0312 14:51:44.998921 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 14:51:49 crc kubenswrapper[4869]: I0312 14:51:49.683945 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:51:49 crc kubenswrapper[4869]: I0312 14:51:49.684010 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.155446 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555452-jbkgg"] Mar 12 14:52:00 crc kubenswrapper[4869]: E0312 14:52:00.156232 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" containerName="installer" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.156248 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" containerName="installer" Mar 12 14:52:00 crc kubenswrapper[4869]: 
E0312 14:52:00.156264 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.156271 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.156385 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.156399 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fafe913d-7e55-4f19-8ebd-7ef3d6fdd4a2" containerName="installer" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.156771 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.159479 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.159592 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.159698 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.162474 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-jbkgg"] Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.235790 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzqpr\" (UniqueName: \"kubernetes.io/projected/bf2f746c-3fb9-425d-a06c-62c3cf768500-kube-api-access-mzqpr\") pod \"auto-csr-approver-29555452-jbkgg\" (UID: 
\"bf2f746c-3fb9-425d-a06c-62c3cf768500\") " pod="openshift-infra/auto-csr-approver-29555452-jbkgg" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.336702 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzqpr\" (UniqueName: \"kubernetes.io/projected/bf2f746c-3fb9-425d-a06c-62c3cf768500-kube-api-access-mzqpr\") pod \"auto-csr-approver-29555452-jbkgg\" (UID: \"bf2f746c-3fb9-425d-a06c-62c3cf768500\") " pod="openshift-infra/auto-csr-approver-29555452-jbkgg" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.355816 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzqpr\" (UniqueName: \"kubernetes.io/projected/bf2f746c-3fb9-425d-a06c-62c3cf768500-kube-api-access-mzqpr\") pod \"auto-csr-approver-29555452-jbkgg\" (UID: \"bf2f746c-3fb9-425d-a06c-62c3cf768500\") " pod="openshift-infra/auto-csr-approver-29555452-jbkgg" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.480206 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" Mar 12 14:52:00 crc kubenswrapper[4869]: I0312 14:52:00.877393 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-jbkgg"] Mar 12 14:52:01 crc kubenswrapper[4869]: I0312 14:52:01.330962 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" event={"ID":"bf2f746c-3fb9-425d-a06c-62c3cf768500","Type":"ContainerStarted","Data":"775e1209b62d249ae77e54d60ec25261c4619139b938f16c2fbf59a214e69e26"} Mar 12 14:52:02 crc kubenswrapper[4869]: I0312 14:52:02.341395 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" event={"ID":"bf2f746c-3fb9-425d-a06c-62c3cf768500","Type":"ContainerStarted","Data":"8f4927e0f25391a515e345445412ddbdfae727c49288de1397c6a4501444f551"} Mar 12 14:52:02 crc kubenswrapper[4869]: I0312 14:52:02.351104 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" podStartSLOduration=1.146198648 podStartE2EDuration="2.351082186s" podCreationTimestamp="2026-03-12 14:52:00 +0000 UTC" firstStartedPulling="2026-03-12 14:52:00.884671227 +0000 UTC m=+273.169896505" lastFinishedPulling="2026-03-12 14:52:02.089554765 +0000 UTC m=+274.374780043" observedRunningTime="2026-03-12 14:52:02.346646113 +0000 UTC m=+274.631871391" watchObservedRunningTime="2026-03-12 14:52:02.351082186 +0000 UTC m=+274.636307464" Mar 12 14:52:03 crc kubenswrapper[4869]: I0312 14:52:03.342597 4869 generic.go:334] "Generic (PLEG): container finished" podID="bf2f746c-3fb9-425d-a06c-62c3cf768500" containerID="8f4927e0f25391a515e345445412ddbdfae727c49288de1397c6a4501444f551" exitCode=0 Mar 12 14:52:03 crc kubenswrapper[4869]: I0312 14:52:03.342695 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" 
event={"ID":"bf2f746c-3fb9-425d-a06c-62c3cf768500","Type":"ContainerDied","Data":"8f4927e0f25391a515e345445412ddbdfae727c49288de1397c6a4501444f551"} Mar 12 14:52:04 crc kubenswrapper[4869]: I0312 14:52:04.587216 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" Mar 12 14:52:04 crc kubenswrapper[4869]: I0312 14:52:04.697874 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzqpr\" (UniqueName: \"kubernetes.io/projected/bf2f746c-3fb9-425d-a06c-62c3cf768500-kube-api-access-mzqpr\") pod \"bf2f746c-3fb9-425d-a06c-62c3cf768500\" (UID: \"bf2f746c-3fb9-425d-a06c-62c3cf768500\") " Mar 12 14:52:04 crc kubenswrapper[4869]: I0312 14:52:04.702079 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2f746c-3fb9-425d-a06c-62c3cf768500-kube-api-access-mzqpr" (OuterVolumeSpecName: "kube-api-access-mzqpr") pod "bf2f746c-3fb9-425d-a06c-62c3cf768500" (UID: "bf2f746c-3fb9-425d-a06c-62c3cf768500"). InnerVolumeSpecName "kube-api-access-mzqpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:52:04 crc kubenswrapper[4869]: I0312 14:52:04.799533 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzqpr\" (UniqueName: \"kubernetes.io/projected/bf2f746c-3fb9-425d-a06c-62c3cf768500-kube-api-access-mzqpr\") on node \"crc\" DevicePath \"\"" Mar 12 14:52:05 crc kubenswrapper[4869]: I0312 14:52:05.354349 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" event={"ID":"bf2f746c-3fb9-425d-a06c-62c3cf768500","Type":"ContainerDied","Data":"775e1209b62d249ae77e54d60ec25261c4619139b938f16c2fbf59a214e69e26"} Mar 12 14:52:05 crc kubenswrapper[4869]: I0312 14:52:05.354387 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="775e1209b62d249ae77e54d60ec25261c4619139b938f16c2fbf59a214e69e26" Mar 12 14:52:05 crc kubenswrapper[4869]: I0312 14:52:05.354410 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-jbkgg" Mar 12 14:52:19 crc kubenswrapper[4869]: I0312 14:52:19.684261 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:52:19 crc kubenswrapper[4869]: I0312 14:52:19.685134 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:52:19 crc kubenswrapper[4869]: I0312 14:52:19.685223 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:52:19 crc kubenswrapper[4869]: I0312 14:52:19.686231 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:52:19 crc kubenswrapper[4869]: I0312 14:52:19.686332 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4" gracePeriod=600 Mar 12 14:52:20 crc kubenswrapper[4869]: I0312 14:52:20.443177 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4" exitCode=0 Mar 12 14:52:20 crc kubenswrapper[4869]: I0312 14:52:20.443262 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4"} Mar 12 14:52:20 crc kubenswrapper[4869]: I0312 14:52:20.443709 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"7938ffdcb0f5d68aef181bbbe274d4e45cbcba7c963f76ee6c1f153d4d2ccdd0"} Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.902714 4869 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-99thx"] Mar 12 14:53:05 crc kubenswrapper[4869]: E0312 14:53:05.903422 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf2f746c-3fb9-425d-a06c-62c3cf768500" containerName="oc" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.903434 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2f746c-3fb9-425d-a06c-62c3cf768500" containerName="oc" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.903534 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf2f746c-3fb9-425d-a06c-62c3cf768500" containerName="oc" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.903984 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.915467 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-99thx"] Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967291 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j5jv\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-kube-api-access-8j5jv\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967331 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-bound-sa-token\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967368 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-trusted-ca\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967394 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-ca-trust-extracted\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967555 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-registry-certificates\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967619 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967654 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-registry-tls\") pod \"image-registry-66df7c8f76-99thx\" (UID: 
\"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.967677 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-installation-pull-secrets\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:05 crc kubenswrapper[4869]: I0312 14:53:05.985771 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.068740 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-registry-tls\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.069096 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-installation-pull-secrets\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.069133 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8j5jv\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-kube-api-access-8j5jv\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.069154 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-bound-sa-token\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.069192 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-trusted-ca\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.069214 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-ca-trust-extracted\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.069242 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-registry-certificates\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.070170 
4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-ca-trust-extracted\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.070403 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-registry-certificates\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.070622 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-trusted-ca\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.074159 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-installation-pull-secrets\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.078007 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-registry-tls\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc 
kubenswrapper[4869]: I0312 14:53:06.084598 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-bound-sa-token\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.086713 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j5jv\" (UniqueName: \"kubernetes.io/projected/cce33ba2-a5b9-41d4-8ce6-d117621a4c62-kube-api-access-8j5jv\") pod \"image-registry-66df7c8f76-99thx\" (UID: \"cce33ba2-a5b9-41d4-8ce6-d117621a4c62\") " pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:06 crc kubenswrapper[4869]: I0312 14:53:06.218954 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:07 crc kubenswrapper[4869]: I0312 14:53:07.980910 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-99thx"] Mar 12 14:53:08 crc kubenswrapper[4869]: I0312 14:53:08.848176 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-99thx" event={"ID":"cce33ba2-a5b9-41d4-8ce6-d117621a4c62","Type":"ContainerStarted","Data":"7ce6e22682716abcf209af59718bc3abddd397c1b7a88f56f32f9e0fa2b1b2d9"} Mar 12 14:53:08 crc kubenswrapper[4869]: I0312 14:53:08.848517 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-99thx" Mar 12 14:53:08 crc kubenswrapper[4869]: I0312 14:53:08.848659 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-99thx" 
event={"ID":"cce33ba2-a5b9-41d4-8ce6-d117621a4c62","Type":"ContainerStarted","Data":"4a25781cb0bf2f45fdb969eee3c059b249050d5930360c6722b95ea2bc1115f6"} Mar 12 14:53:08 crc kubenswrapper[4869]: I0312 14:53:08.872141 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-99thx" podStartSLOduration=3.872115651 podStartE2EDuration="3.872115651s" podCreationTimestamp="2026-03-12 14:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:53:08.866579855 +0000 UTC m=+341.151805153" watchObservedRunningTime="2026-03-12 14:53:08.872115651 +0000 UTC m=+341.157340929" Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.798710 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngq4q"] Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.801972 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ngq4q" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="registry-server" containerID="cri-o://47f226d09b13229e429c8112fdcda62314f1524f8689557e2310c93169c1764e" gracePeriod=30 Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.825243 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxjq9"] Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.825882 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lxjq9" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="registry-server" containerID="cri-o://e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b" gracePeriod=30 Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.838928 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-qdm27"] Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.848367 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvphv"] Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.848695 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jvphv" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="registry-server" containerID="cri-o://d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439" gracePeriod=30 Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.857957 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs5nz"] Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.858242 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zs5nz" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="registry-server" containerID="cri-o://cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e" gracePeriod=30 Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.872943 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9spzt"] Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.873730 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.875901 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9spzt"] Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.984785 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.984850 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b9r2\" (UniqueName: \"kubernetes.io/projected/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-kube-api-access-2b9r2\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:21 crc kubenswrapper[4869]: I0312 14:53:21.984905 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.086015 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9spzt\" (UID: 
\"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.086071 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b9r2\" (UniqueName: \"kubernetes.io/projected/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-kube-api-access-2b9r2\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.086175 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.087490 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.092047 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.104772 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b9r2\" 
(UniqueName: \"kubernetes.io/projected/57f6b5a5-3396-4238-b09c-5c5cf9a81ff9-kube-api-access-2b9r2\") pod \"marketplace-operator-79b997595-9spzt\" (UID: \"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9\") " pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.150690 4869 generic.go:334] "Generic (PLEG): container finished" podID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerID="47f226d09b13229e429c8112fdcda62314f1524f8689557e2310c93169c1764e" exitCode=0 Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.150780 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngq4q" event={"ID":"8bf6f93d-a6f8-494b-abfa-47f8a164b667","Type":"ContainerDied","Data":"47f226d09b13229e429c8112fdcda62314f1524f8689557e2310c93169c1764e"} Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.150876 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerName="marketplace-operator" containerID="cri-o://28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808" gracePeriod=30 Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.356013 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.667118 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:53:22 crc kubenswrapper[4869]: E0312 14:53:22.714056 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa8f886_960d_4115_a0e0_3ca252d2af08.slice/crio-conmon-cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e.scope\": RecentStats: unable to find data in memory cache]" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.768058 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.773255 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.811596 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8m68\" (UniqueName: \"kubernetes.io/projected/8bf6f93d-a6f8-494b-abfa-47f8a164b667-kube-api-access-r8m68\") pod \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.811708 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-utilities\") pod \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.811826 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-catalog-content\") pod \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\" (UID: \"8bf6f93d-a6f8-494b-abfa-47f8a164b667\") " Mar 12 14:53:22 crc 
kubenswrapper[4869]: I0312 14:53:22.813739 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-utilities" (OuterVolumeSpecName: "utilities") pod "8bf6f93d-a6f8-494b-abfa-47f8a164b667" (UID: "8bf6f93d-a6f8-494b-abfa-47f8a164b667"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.820497 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf6f93d-a6f8-494b-abfa-47f8a164b667-kube-api-access-r8m68" (OuterVolumeSpecName: "kube-api-access-r8m68") pod "8bf6f93d-a6f8-494b-abfa-47f8a164b667" (UID: "8bf6f93d-a6f8-494b-abfa-47f8a164b667"). InnerVolumeSpecName "kube-api-access-r8m68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.831905 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvphv" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.878052 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9spzt"] Mar 12 14:53:22 crc kubenswrapper[4869]: W0312 14:53:22.884295 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f6b5a5_3396_4238_b09c_5c5cf9a81ff9.slice/crio-4e81228997b73214fd94190774c40f5ec3f561bd24dedb057ba08318800137a0 WatchSource:0}: Error finding container 4e81228997b73214fd94190774c40f5ec3f561bd24dedb057ba08318800137a0: Status 404 returned error can't find the container with id 4e81228997b73214fd94190774c40f5ec3f561bd24dedb057ba08318800137a0 Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.890939 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bf6f93d-a6f8-494b-abfa-47f8a164b667" (UID: "8bf6f93d-a6f8-494b-abfa-47f8a164b667"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913198 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-utilities\") pod \"ffa8f886-960d-4115-a0e0-3ca252d2af08\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913260 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-catalog-content\") pod \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913288 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-catalog-content\") pod \"ffa8f886-960d-4115-a0e0-3ca252d2af08\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913315 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nn9z\" (UniqueName: \"kubernetes.io/projected/ffa8f886-960d-4115-a0e0-3ca252d2af08-kube-api-access-5nn9z\") pod \"ffa8f886-960d-4115-a0e0-3ca252d2af08\" (UID: \"ffa8f886-960d-4115-a0e0-3ca252d2af08\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913361 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pltp\" (UniqueName: 
\"kubernetes.io/projected/9b8fa922-8e49-42e1-a4a5-40069c505bf6-kube-api-access-6pltp\") pod \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913378 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-utilities\") pod \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\" (UID: \"9b8fa922-8e49-42e1-a4a5-40069c505bf6\") " Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913631 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8m68\" (UniqueName: \"kubernetes.io/projected/8bf6f93d-a6f8-494b-abfa-47f8a164b667-kube-api-access-r8m68\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913644 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.913652 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bf6f93d-a6f8-494b-abfa-47f8a164b667-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.914180 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-utilities" (OuterVolumeSpecName: "utilities") pod "ffa8f886-960d-4115-a0e0-3ca252d2af08" (UID: "ffa8f886-960d-4115-a0e0-3ca252d2af08"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.914277 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-utilities" (OuterVolumeSpecName: "utilities") pod "9b8fa922-8e49-42e1-a4a5-40069c505bf6" (UID: "9b8fa922-8e49-42e1-a4a5-40069c505bf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.916822 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa8f886-960d-4115-a0e0-3ca252d2af08-kube-api-access-5nn9z" (OuterVolumeSpecName: "kube-api-access-5nn9z") pod "ffa8f886-960d-4115-a0e0-3ca252d2af08" (UID: "ffa8f886-960d-4115-a0e0-3ca252d2af08"). InnerVolumeSpecName "kube-api-access-5nn9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.918121 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8fa922-8e49-42e1-a4a5-40069c505bf6-kube-api-access-6pltp" (OuterVolumeSpecName: "kube-api-access-6pltp") pod "9b8fa922-8e49-42e1-a4a5-40069c505bf6" (UID: "9b8fa922-8e49-42e1-a4a5-40069c505bf6"). InnerVolumeSpecName "kube-api-access-6pltp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:53:22 crc kubenswrapper[4869]: I0312 14:53:22.960910 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b8fa922-8e49-42e1-a4a5-40069c505bf6" (UID: "9b8fa922-8e49-42e1-a4a5-40069c505bf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.010590 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.014285 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f52p\" (UniqueName: \"kubernetes.io/projected/e65737a9-e615-4a51-a72c-e4b561bdd1b0-kube-api-access-2f52p\") pod \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.014373 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-catalog-content\") pod \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.014983 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-utilities\") pod \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\" (UID: \"e65737a9-e615-4a51-a72c-e4b561bdd1b0\") " Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.015286 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nn9z\" (UniqueName: \"kubernetes.io/projected/ffa8f886-960d-4115-a0e0-3ca252d2af08-kube-api-access-5nn9z\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.015301 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pltp\" (UniqueName: \"kubernetes.io/projected/9b8fa922-8e49-42e1-a4a5-40069c505bf6-kube-api-access-6pltp\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.015312 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc 
kubenswrapper[4869]: I0312 14:53:23.015324 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.015335 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b8fa922-8e49-42e1-a4a5-40069c505bf6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.016179 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-utilities" (OuterVolumeSpecName: "utilities") pod "e65737a9-e615-4a51-a72c-e4b561bdd1b0" (UID: "e65737a9-e615-4a51-a72c-e4b561bdd1b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.020249 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65737a9-e615-4a51-a72c-e4b561bdd1b0-kube-api-access-2f52p" (OuterVolumeSpecName: "kube-api-access-2f52p") pod "e65737a9-e615-4a51-a72c-e4b561bdd1b0" (UID: "e65737a9-e615-4a51-a72c-e4b561bdd1b0"). InnerVolumeSpecName "kube-api-access-2f52p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.061673 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffa8f886-960d-4115-a0e0-3ca252d2af08" (UID: "ffa8f886-960d-4115-a0e0-3ca252d2af08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.063189 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e65737a9-e615-4a51-a72c-e4b561bdd1b0" (UID: "e65737a9-e615-4a51-a72c-e4b561bdd1b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.116374 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95flb\" (UniqueName: \"kubernetes.io/projected/bb16ce4b-e604-45d9-9635-c2565dcbd228-kube-api-access-95flb\") pod \"bb16ce4b-e604-45d9-9635-c2565dcbd228\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.116483 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-trusted-ca\") pod \"bb16ce4b-e604-45d9-9635-c2565dcbd228\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.116517 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-operator-metrics\") pod \"bb16ce4b-e604-45d9-9635-c2565dcbd228\" (UID: \"bb16ce4b-e604-45d9-9635-c2565dcbd228\") " Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.117027 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bb16ce4b-e604-45d9-9635-c2565dcbd228" (UID: "bb16ce4b-e604-45d9-9635-c2565dcbd228"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.117474 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.117507 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffa8f886-960d-4115-a0e0-3ca252d2af08-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.117524 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f52p\" (UniqueName: \"kubernetes.io/projected/e65737a9-e615-4a51-a72c-e4b561bdd1b0-kube-api-access-2f52p\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.117534 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e65737a9-e615-4a51-a72c-e4b561bdd1b0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.119383 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bb16ce4b-e604-45d9-9635-c2565dcbd228" (UID: "bb16ce4b-e604-45d9-9635-c2565dcbd228"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.119629 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb16ce4b-e604-45d9-9635-c2565dcbd228-kube-api-access-95flb" (OuterVolumeSpecName: "kube-api-access-95flb") pod "bb16ce4b-e604-45d9-9635-c2565dcbd228" (UID: "bb16ce4b-e604-45d9-9635-c2565dcbd228"). InnerVolumeSpecName "kube-api-access-95flb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.158728 4869 generic.go:334] "Generic (PLEG): container finished" podID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerID="e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b" exitCode=0 Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.158795 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxjq9" event={"ID":"9b8fa922-8e49-42e1-a4a5-40069c505bf6","Type":"ContainerDied","Data":"e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.158825 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxjq9" event={"ID":"9b8fa922-8e49-42e1-a4a5-40069c505bf6","Type":"ContainerDied","Data":"f8859531dcd7eb6bb66b7b9b84212ac829828f6def777b5ae413cf390bcac1f9"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.158841 4869 scope.go:117] "RemoveContainer" containerID="e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.158966 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lxjq9" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.162881 4869 generic.go:334] "Generic (PLEG): container finished" podID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerID="cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e" exitCode=0 Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.162950 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5nz" event={"ID":"ffa8f886-960d-4115-a0e0-3ca252d2af08","Type":"ContainerDied","Data":"cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.162982 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs5nz" event={"ID":"ffa8f886-960d-4115-a0e0-3ca252d2af08","Type":"ContainerDied","Data":"f9123e78ae5147462aead70ddb6d9442b1f27e1c26fcbd8ac381c67efcebb3e8"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.163059 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zs5nz" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.166321 4869 generic.go:334] "Generic (PLEG): container finished" podID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerID="28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808" exitCode=0 Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.166380 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" event={"ID":"bb16ce4b-e604-45d9-9635-c2565dcbd228","Type":"ContainerDied","Data":"28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.166400 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" event={"ID":"bb16ce4b-e604-45d9-9635-c2565dcbd228","Type":"ContainerDied","Data":"54d82dc2700af463e03e36dece7879feeb7a725100c2349bef6d0a02d1105f36"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.166446 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qdm27" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.170207 4869 generic.go:334] "Generic (PLEG): container finished" podID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerID="d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439" exitCode=0 Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.170271 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvphv" event={"ID":"e65737a9-e615-4a51-a72c-e4b561bdd1b0","Type":"ContainerDied","Data":"d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.170300 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvphv" event={"ID":"e65737a9-e615-4a51-a72c-e4b561bdd1b0","Type":"ContainerDied","Data":"a3c523124e0530612b9ba899ed2d6d98b37b48d0e3afe05bd244b28f7954a0db"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.170385 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvphv" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.173697 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngq4q" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.173684 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngq4q" event={"ID":"8bf6f93d-a6f8-494b-abfa-47f8a164b667","Type":"ContainerDied","Data":"4ac566714f09e1835b523976a7ef6a7c380014ce3aede52bcacc6dfa92c6aea5"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.175706 4869 scope.go:117] "RemoveContainer" containerID="c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.176486 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" event={"ID":"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9","Type":"ContainerStarted","Data":"6420d8f583eeb2b66fc36c71ca8c944ec50be106f05a8c83c106897009fef199"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.176533 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" event={"ID":"57f6b5a5-3396-4238-b09c-5c5cf9a81ff9","Type":"ContainerStarted","Data":"4e81228997b73214fd94190774c40f5ec3f561bd24dedb057ba08318800137a0"} Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.177229 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.179587 4869 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9spzt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.179648 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" 
podUID="57f6b5a5-3396-4238-b09c-5c5cf9a81ff9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.197687 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxjq9"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.213643 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lxjq9"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.215762 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs5nz"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.216975 4869 scope.go:117] "RemoveContainer" containerID="dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.218358 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.218467 4869 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bb16ce4b-e604-45d9-9635-c2565dcbd228-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.218485 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95flb\" (UniqueName: \"kubernetes.io/projected/bb16ce4b-e604-45d9-9635-c2565dcbd228-kube-api-access-95flb\") on node \"crc\" DevicePath \"\"" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.219345 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zs5nz"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.222429 
4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" podStartSLOduration=2.2224190090000002 podStartE2EDuration="2.222419009s" podCreationTimestamp="2026-03-12 14:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:53:23.221937705 +0000 UTC m=+355.507162983" watchObservedRunningTime="2026-03-12 14:53:23.222419009 +0000 UTC m=+355.507644277" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.239464 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvphv"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.243001 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvphv"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.251318 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qdm27"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.254201 4869 scope.go:117] "RemoveContainer" containerID="e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.254620 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qdm27"] Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.254681 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b\": container with ID starting with e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b not found: ID does not exist" containerID="e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.254703 4869 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b"} err="failed to get container status \"e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b\": rpc error: code = NotFound desc = could not find container \"e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b\": container with ID starting with e0bc81cd57072cc8e509dbf4315f8c4748f81116d8fee2eb326b338a0f6bf66b not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.254747 4869 scope.go:117] "RemoveContainer" containerID="c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.257591 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389\": container with ID starting with c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389 not found: ID does not exist" containerID="c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.257641 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389"} err="failed to get container status \"c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389\": rpc error: code = NotFound desc = could not find container \"c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389\": container with ID starting with c9b40e76993fec66ffcc9316489fec0b6cdc784726c11fc9bff161cf0b543389 not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.257676 4869 scope.go:117] "RemoveContainer" containerID="dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.258025 4869 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d\": container with ID starting with dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d not found: ID does not exist" containerID="dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.258058 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d"} err="failed to get container status \"dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d\": rpc error: code = NotFound desc = could not find container \"dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d\": container with ID starting with dd5ef113cbf9f5b5f5cc494d7f654afbd93c86f1c15d03f746dd84611adce10d not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.258083 4869 scope.go:117] "RemoveContainer" containerID="cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.262921 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngq4q"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.267617 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ngq4q"] Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.269335 4869 scope.go:117] "RemoveContainer" containerID="83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.283630 4869 scope.go:117] "RemoveContainer" containerID="2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.299401 4869 scope.go:117] "RemoveContainer" 
containerID="cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.299855 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e\": container with ID starting with cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e not found: ID does not exist" containerID="cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.299884 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e"} err="failed to get container status \"cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e\": rpc error: code = NotFound desc = could not find container \"cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e\": container with ID starting with cb9a2a3df6841a00da4a501abf635b79e546fc54625f796a96cca53b32361a9e not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.299906 4869 scope.go:117] "RemoveContainer" containerID="83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.300262 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be\": container with ID starting with 83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be not found: ID does not exist" containerID="83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.300283 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be"} err="failed to get container status \"83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be\": rpc error: code = NotFound desc = could not find container \"83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be\": container with ID starting with 83688d11ff4972d61fc95637b12f80cd8df70d8bb3987e90ba90cf6e72e253be not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.300296 4869 scope.go:117] "RemoveContainer" containerID="2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.300484 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405\": container with ID starting with 2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405 not found: ID does not exist" containerID="2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.300508 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405"} err="failed to get container status \"2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405\": rpc error: code = NotFound desc = could not find container \"2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405\": container with ID starting with 2ae07afe2e446ef24abdf509609098d57b9aa81be6134615aecf64a2bbc93405 not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.300525 4869 scope.go:117] "RemoveContainer" containerID="28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.312436 4869 scope.go:117] "RemoveContainer" 
containerID="28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.312783 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808\": container with ID starting with 28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808 not found: ID does not exist" containerID="28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.312811 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808"} err="failed to get container status \"28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808\": rpc error: code = NotFound desc = could not find container \"28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808\": container with ID starting with 28c39ab654df88951b8558c26bafcf32029ed735756309b4459f406e3274b808 not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.312836 4869 scope.go:117] "RemoveContainer" containerID="d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.326159 4869 scope.go:117] "RemoveContainer" containerID="f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.338904 4869 scope.go:117] "RemoveContainer" containerID="54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.352764 4869 scope.go:117] "RemoveContainer" containerID="d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.353331 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439\": container with ID starting with d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439 not found: ID does not exist" containerID="d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.353365 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439"} err="failed to get container status \"d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439\": rpc error: code = NotFound desc = could not find container \"d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439\": container with ID starting with d01c7a54a8b6952334e4b198cabcab20732adf70bb899af7e59c988a8e4d0439 not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.353396 4869 scope.go:117] "RemoveContainer" containerID="f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.353774 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a\": container with ID starting with f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a not found: ID does not exist" containerID="f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.353805 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a"} err="failed to get container status \"f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a\": rpc error: code = NotFound desc = could not find container 
\"f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a\": container with ID starting with f8657cc80b562c840a5962dbf99a18f7c23c9fb943d0a4f378426ad6ea01fe7a not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.353826 4869 scope.go:117] "RemoveContainer" containerID="54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45" Mar 12 14:53:23 crc kubenswrapper[4869]: E0312 14:53:23.354127 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45\": container with ID starting with 54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45 not found: ID does not exist" containerID="54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.354176 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45"} err="failed to get container status \"54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45\": rpc error: code = NotFound desc = could not find container \"54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45\": container with ID starting with 54c5ce06e9a70aa0363a427c1faff190350503758a8ea3524e3015b04ce7ef45 not found: ID does not exist" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.354201 4869 scope.go:117] "RemoveContainer" containerID="47f226d09b13229e429c8112fdcda62314f1524f8689557e2310c93169c1764e" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.371994 4869 scope.go:117] "RemoveContainer" containerID="5f069dafa1443afdf8854b3f8a9943d1403024586b5b16823c3872cf9380cff1" Mar 12 14:53:23 crc kubenswrapper[4869]: I0312 14:53:23.392574 4869 scope.go:117] "RemoveContainer" containerID="79d87369560d867a7b26c8308a3d5f725140037cd32cfc4b4e787911e3bba063" Mar 12 14:53:24 crc 
kubenswrapper[4869]: I0312 14:53:24.191866 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9spzt" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.347045 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" path="/var/lib/kubelet/pods/8bf6f93d-a6f8-494b-abfa-47f8a164b667/volumes" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.348182 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" path="/var/lib/kubelet/pods/9b8fa922-8e49-42e1-a4a5-40069c505bf6/volumes" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.350389 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" path="/var/lib/kubelet/pods/bb16ce4b-e604-45d9-9635-c2565dcbd228/volumes" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.351252 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" path="/var/lib/kubelet/pods/e65737a9-e615-4a51-a72c-e4b561bdd1b0/volumes" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.352271 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" path="/var/lib/kubelet/pods/ffa8f886-960d-4115-a0e0-3ca252d2af08/volumes" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612418 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p6g28"] Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612662 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="extract-utilities" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612677 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="extract-utilities" 
Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612687 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612695 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612706 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612714 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612725 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerName="marketplace-operator" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612733 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerName="marketplace-operator" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612745 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612752 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612763 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="extract-utilities" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612770 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="extract-utilities" 
Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612785 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612793 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612802 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612809 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612819 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612826 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612838 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="extract-utilities" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612845 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="extract-utilities" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612857 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612865 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="registry-server" Mar 12 
14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612874 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="extract-utilities" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612882 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="extract-utilities" Mar 12 14:53:24 crc kubenswrapper[4869]: E0312 14:53:24.612892 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612899 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="extract-content" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.612995 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf6f93d-a6f8-494b-abfa-47f8a164b667" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.613007 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa8f886-960d-4115-a0e0-3ca252d2af08" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.613015 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb16ce4b-e604-45d9-9635-c2565dcbd228" containerName="marketplace-operator" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.613024 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65737a9-e615-4a51-a72c-e4b561bdd1b0" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.613035 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8fa922-8e49-42e1-a4a5-40069c505bf6" containerName="registry-server" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.613784 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6g28" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.623780 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.630728 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6g28"] Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.734554 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9nrm\" (UniqueName: \"kubernetes.io/projected/4aedeb34-f607-43d8-89bc-dac85b2c68ba-kube-api-access-g9nrm\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.734754 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-catalog-content\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28" Mar 12 14:53:24 crc kubenswrapper[4869]: I0312 14:53:24.734837 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-utilities\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28" Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.106525 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-catalog-content\") pod \"redhat-operators-p6g28\" (UID: 
\"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.106630 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-utilities\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.106688 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9nrm\" (UniqueName: \"kubernetes.io/projected/4aedeb34-f607-43d8-89bc-dac85b2c68ba-kube-api-access-g9nrm\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.107009 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-catalog-content\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.107205 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-utilities\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.126928 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9nrm\" (UniqueName: \"kubernetes.io/projected/4aedeb34-f607-43d8-89bc-dac85b2c68ba-kube-api-access-g9nrm\") pod \"redhat-operators-p6g28\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.240599 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.428376 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6g28"]
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.617181 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8d8dl"]
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.619617 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.621249 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2r47\" (UniqueName: \"kubernetes.io/projected/91158fdd-957d-44dc-889c-325cdcffb980-kube-api-access-c2r47\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.621328 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91158fdd-957d-44dc-889c-325cdcffb980-utilities\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.621379 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91158fdd-957d-44dc-889c-325cdcffb980-catalog-content\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.621784 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.627935 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d8dl"]
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.722067 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91158fdd-957d-44dc-889c-325cdcffb980-utilities\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.722114 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91158fdd-957d-44dc-889c-325cdcffb980-catalog-content\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.722181 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2r47\" (UniqueName: \"kubernetes.io/projected/91158fdd-957d-44dc-889c-325cdcffb980-kube-api-access-c2r47\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.722634 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91158fdd-957d-44dc-889c-325cdcffb980-catalog-content\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.722782 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91158fdd-957d-44dc-889c-325cdcffb980-utilities\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.740428 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2r47\" (UniqueName: \"kubernetes.io/projected/91158fdd-957d-44dc-889c-325cdcffb980-kube-api-access-c2r47\") pod \"certified-operators-8d8dl\" (UID: \"91158fdd-957d-44dc-889c-325cdcffb980\") " pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:25 crc kubenswrapper[4869]: I0312 14:53:25.932645 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:26 crc kubenswrapper[4869]: I0312 14:53:26.198271 4869 generic.go:334] "Generic (PLEG): container finished" podID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerID="6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011" exitCode=0
Mar 12 14:53:26 crc kubenswrapper[4869]: I0312 14:53:26.198328 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6g28" event={"ID":"4aedeb34-f607-43d8-89bc-dac85b2c68ba","Type":"ContainerDied","Data":"6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011"}
Mar 12 14:53:26 crc kubenswrapper[4869]: I0312 14:53:26.198369 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6g28" event={"ID":"4aedeb34-f607-43d8-89bc-dac85b2c68ba","Type":"ContainerStarted","Data":"0c32f7809452e239700566f87ff05b0e679a14939fb35bef474c2f0f412b51ce"}
Mar 12 14:53:26 crc kubenswrapper[4869]: I0312 14:53:26.222946 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-99thx"
Mar 12 14:53:26 crc kubenswrapper[4869]: I0312 14:53:26.430481 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5lmx8"]
Mar 12 14:53:26 crc kubenswrapper[4869]: I0312 14:53:26.673160 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d8dl"]
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.021820 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlrr7"]
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.023387 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.025920 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.031800 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlrr7"]
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.172958 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwgf\" (UniqueName: \"kubernetes.io/projected/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-kube-api-access-9xwgf\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.173169 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-catalog-content\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.173325 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-utilities\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.203818 4869 generic.go:334] "Generic (PLEG): container finished" podID="91158fdd-957d-44dc-889c-325cdcffb980" containerID="50aca4aa161a7835aeceeeba0c9f0c01cb7ef1ac158ac8b86b364d70a8601a54" exitCode=0
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.203909 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d8dl" event={"ID":"91158fdd-957d-44dc-889c-325cdcffb980","Type":"ContainerDied","Data":"50aca4aa161a7835aeceeeba0c9f0c01cb7ef1ac158ac8b86b364d70a8601a54"}
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.204192 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d8dl" event={"ID":"91158fdd-957d-44dc-889c-325cdcffb980","Type":"ContainerStarted","Data":"31676beba64b1a45c952afecaa2f0489c8783a7416c78076b4b16554a406ef0f"}
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.274374 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwgf\" (UniqueName: \"kubernetes.io/projected/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-kube-api-access-9xwgf\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.274760 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-catalog-content\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.274969 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-utilities\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.275479 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-catalog-content\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.275672 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-utilities\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.295232 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xwgf\" (UniqueName: \"kubernetes.io/projected/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-kube-api-access-9xwgf\") pod \"community-operators-hlrr7\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.343623 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:27 crc kubenswrapper[4869]: I0312 14:53:27.541530 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlrr7"]
Mar 12 14:53:27 crc kubenswrapper[4869]: W0312 14:53:27.542990 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870b2ca4_83f3_4a2b_9bc4_3dea9c1773f6.slice/crio-780c45694cf780d1739d5a67d67033d17d915e53ea0fc5b26863aac3c87476a3 WatchSource:0}: Error finding container 780c45694cf780d1739d5a67d67033d17d915e53ea0fc5b26863aac3c87476a3: Status 404 returned error can't find the container with id 780c45694cf780d1739d5a67d67033d17d915e53ea0fc5b26863aac3c87476a3
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.020222 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-76wh2"]
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.023118 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.027233 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76wh2"]
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.027278 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.188634 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwcv8\" (UniqueName: \"kubernetes.io/projected/a697779f-0fbb-4a95-aa75-b8fe3fc77944-kube-api-access-xwcv8\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.188677 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a697779f-0fbb-4a95-aa75-b8fe3fc77944-utilities\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.188741 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a697779f-0fbb-4a95-aa75-b8fe3fc77944-catalog-content\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.210470 4869 generic.go:334] "Generic (PLEG): container finished" podID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerID="fd515955af0aa011af81ccef68cc4d1e4cb8e40f7636f79f1d0a356ccfbaec0c" exitCode=0
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.210525 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlrr7" event={"ID":"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6","Type":"ContainerDied","Data":"fd515955af0aa011af81ccef68cc4d1e4cb8e40f7636f79f1d0a356ccfbaec0c"}
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.210936 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlrr7" event={"ID":"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6","Type":"ContainerStarted","Data":"780c45694cf780d1739d5a67d67033d17d915e53ea0fc5b26863aac3c87476a3"}
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.212669 4869 generic.go:334] "Generic (PLEG): container finished" podID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerID="010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c" exitCode=0
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.212704 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6g28" event={"ID":"4aedeb34-f607-43d8-89bc-dac85b2c68ba","Type":"ContainerDied","Data":"010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c"}
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.292035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a697779f-0fbb-4a95-aa75-b8fe3fc77944-catalog-content\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.292121 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcv8\" (UniqueName: \"kubernetes.io/projected/a697779f-0fbb-4a95-aa75-b8fe3fc77944-kube-api-access-xwcv8\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.292168 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a697779f-0fbb-4a95-aa75-b8fe3fc77944-utilities\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.292808 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a697779f-0fbb-4a95-aa75-b8fe3fc77944-catalog-content\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.293956 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a697779f-0fbb-4a95-aa75-b8fe3fc77944-utilities\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.310611 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcv8\" (UniqueName: \"kubernetes.io/projected/a697779f-0fbb-4a95-aa75-b8fe3fc77944-kube-api-access-xwcv8\") pod \"redhat-marketplace-76wh2\" (UID: \"a697779f-0fbb-4a95-aa75-b8fe3fc77944\") " pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.339816 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.347370 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:28 crc kubenswrapper[4869]: I0312 14:53:28.582204 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76wh2"]
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.219766 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlrr7" event={"ID":"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6","Type":"ContainerStarted","Data":"7a51ccbaafecd816b0bc847e17ccfee444a82598853130c85a3b74cdef0ac9d3"}
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.221609 4869 generic.go:334] "Generic (PLEG): container finished" podID="91158fdd-957d-44dc-889c-325cdcffb980" containerID="4c621508e2c0a41f09c3bae66b9083b0eea569f9d5ef3897d47c7ee654098632" exitCode=0
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.221640 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d8dl" event={"ID":"91158fdd-957d-44dc-889c-325cdcffb980","Type":"ContainerDied","Data":"4c621508e2c0a41f09c3bae66b9083b0eea569f9d5ef3897d47c7ee654098632"}
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.223564 4869 generic.go:334] "Generic (PLEG): container finished" podID="a697779f-0fbb-4a95-aa75-b8fe3fc77944" containerID="acede576fba69a4c7233260f7f130cc3f0d0b36806960dc89c4ec1a4c459d31c" exitCode=0
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.223630 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76wh2" event={"ID":"a697779f-0fbb-4a95-aa75-b8fe3fc77944","Type":"ContainerDied","Data":"acede576fba69a4c7233260f7f130cc3f0d0b36806960dc89c4ec1a4c459d31c"}
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.223654 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76wh2" event={"ID":"a697779f-0fbb-4a95-aa75-b8fe3fc77944","Type":"ContainerStarted","Data":"99e5f8e14a421aa4b1a05d9ed16c09ad01922f5fc371bd37ebe5472028ca707b"}
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.227502 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6g28" event={"ID":"4aedeb34-f607-43d8-89bc-dac85b2c68ba","Type":"ContainerStarted","Data":"32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514"}
Mar 12 14:53:29 crc kubenswrapper[4869]: I0312 14:53:29.272825 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6g28" podStartSLOduration=2.78015021 podStartE2EDuration="5.272809028s" podCreationTimestamp="2026-03-12 14:53:24 +0000 UTC" firstStartedPulling="2026-03-12 14:53:26.199957836 +0000 UTC m=+358.485183114" lastFinishedPulling="2026-03-12 14:53:28.692616664 +0000 UTC m=+360.977841932" observedRunningTime="2026-03-12 14:53:29.272265072 +0000 UTC m=+361.557490380" watchObservedRunningTime="2026-03-12 14:53:29.272809028 +0000 UTC m=+361.558034306"
Mar 12 14:53:30 crc kubenswrapper[4869]: I0312 14:53:30.233935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76wh2" event={"ID":"a697779f-0fbb-4a95-aa75-b8fe3fc77944","Type":"ContainerStarted","Data":"80eec98b03a1eccbc3068479f150ca6ebfa41694dc1fdbcb2bb0c6a06d110809"}
Mar 12 14:53:30 crc kubenswrapper[4869]: I0312 14:53:30.236027 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d8dl" event={"ID":"91158fdd-957d-44dc-889c-325cdcffb980","Type":"ContainerStarted","Data":"46a479458eced49743ff7e312bf94e95c0148f52aa34932a26348e3ad902b4dd"}
Mar 12 14:53:30 crc kubenswrapper[4869]: I0312 14:53:30.237804 4869 generic.go:334] "Generic (PLEG): container finished" podID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerID="7a51ccbaafecd816b0bc847e17ccfee444a82598853130c85a3b74cdef0ac9d3" exitCode=0
Mar 12 14:53:30 crc kubenswrapper[4869]: I0312 14:53:30.238243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlrr7" event={"ID":"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6","Type":"ContainerDied","Data":"7a51ccbaafecd816b0bc847e17ccfee444a82598853130c85a3b74cdef0ac9d3"}
Mar 12 14:53:30 crc kubenswrapper[4869]: I0312 14:53:30.288489 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8d8dl" podStartSLOduration=2.610619814 podStartE2EDuration="5.288468633s" podCreationTimestamp="2026-03-12 14:53:25 +0000 UTC" firstStartedPulling="2026-03-12 14:53:27.204991501 +0000 UTC m=+359.490216779" lastFinishedPulling="2026-03-12 14:53:29.88284031 +0000 UTC m=+362.168065598" observedRunningTime="2026-03-12 14:53:30.285222165 +0000 UTC m=+362.570447453" watchObservedRunningTime="2026-03-12 14:53:30.288468633 +0000 UTC m=+362.573693921"
Mar 12 14:53:31 crc kubenswrapper[4869]: I0312 14:53:31.254313 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlrr7" event={"ID":"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6","Type":"ContainerStarted","Data":"1f3d97fb96f3c37d2d5b5fea4eeb7efd8571ecf8b66f6fc30ed13ee954df3d7b"}
Mar 12 14:53:31 crc kubenswrapper[4869]: I0312 14:53:31.257981 4869 generic.go:334] "Generic (PLEG): container finished" podID="a697779f-0fbb-4a95-aa75-b8fe3fc77944" containerID="80eec98b03a1eccbc3068479f150ca6ebfa41694dc1fdbcb2bb0c6a06d110809" exitCode=0
Mar 12 14:53:31 crc kubenswrapper[4869]: I0312 14:53:31.258017 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76wh2" event={"ID":"a697779f-0fbb-4a95-aa75-b8fe3fc77944","Type":"ContainerDied","Data":"80eec98b03a1eccbc3068479f150ca6ebfa41694dc1fdbcb2bb0c6a06d110809"}
Mar 12 14:53:31 crc kubenswrapper[4869]: I0312 14:53:31.273370 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlrr7" podStartSLOduration=1.6929687549999999 podStartE2EDuration="4.273354392s" podCreationTimestamp="2026-03-12 14:53:27 +0000 UTC" firstStartedPulling="2026-03-12 14:53:28.228053739 +0000 UTC m=+360.513279017" lastFinishedPulling="2026-03-12 14:53:30.808439376 +0000 UTC m=+363.093664654" observedRunningTime="2026-03-12 14:53:31.268043953 +0000 UTC m=+363.553269241" watchObservedRunningTime="2026-03-12 14:53:31.273354392 +0000 UTC m=+363.558579670"
Mar 12 14:53:32 crc kubenswrapper[4869]: I0312 14:53:32.265697 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76wh2" event={"ID":"a697779f-0fbb-4a95-aa75-b8fe3fc77944","Type":"ContainerStarted","Data":"bf64cf1f0ca6eb3229b8b2f7a7ba9e82803656bbf35ed5da61754451eb06884d"}
Mar 12 14:53:32 crc kubenswrapper[4869]: I0312 14:53:32.279348 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76wh2" podStartSLOduration=1.564559704 podStartE2EDuration="4.279329105s" podCreationTimestamp="2026-03-12 14:53:28 +0000 UTC" firstStartedPulling="2026-03-12 14:53:29.224723611 +0000 UTC m=+361.509948889" lastFinishedPulling="2026-03-12 14:53:31.939493002 +0000 UTC m=+364.224718290" observedRunningTime="2026-03-12 14:53:32.27916174 +0000 UTC m=+364.564387028" watchObservedRunningTime="2026-03-12 14:53:32.279329105 +0000 UTC m=+364.564554383"
Mar 12 14:53:35 crc kubenswrapper[4869]: I0312 14:53:35.241533 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:35 crc kubenswrapper[4869]: I0312 14:53:35.242062 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:35 crc kubenswrapper[4869]: I0312 14:53:35.286272 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:35 crc kubenswrapper[4869]: I0312 14:53:35.321814 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6g28"
Mar 12 14:53:35 crc kubenswrapper[4869]: I0312 14:53:35.933776 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:35 crc kubenswrapper[4869]: I0312 14:53:35.933829 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:35 crc kubenswrapper[4869]: I0312 14:53:35.981119 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:36 crc kubenswrapper[4869]: I0312 14:53:36.316286 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8d8dl"
Mar 12 14:53:37 crc kubenswrapper[4869]: I0312 14:53:37.343934 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:37 crc kubenswrapper[4869]: I0312 14:53:37.344977 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:37 crc kubenswrapper[4869]: I0312 14:53:37.401294 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:38 crc kubenswrapper[4869]: I0312 14:53:38.343616 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlrr7"
Mar 12 14:53:38 crc kubenswrapper[4869]: I0312 14:53:38.347620 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:38 crc kubenswrapper[4869]: I0312 14:53:38.347730 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:38 crc kubenswrapper[4869]: I0312 14:53:38.408539 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:39 crc kubenswrapper[4869]: I0312 14:53:39.342115 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76wh2"
Mar 12 14:53:51 crc kubenswrapper[4869]: I0312 14:53:51.465772 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" podUID="32af1403-874a-49e0-ab8f-96511da15218" containerName="registry" containerID="cri-o://cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9" gracePeriod=30
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.828659 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8"
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990148 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990188 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-registry-certificates\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990208 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-registry-tls\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990304 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32af1403-874a-49e0-ab8f-96511da15218-installation-pull-secrets\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990347 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-trusted-ca\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990389 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32af1403-874a-49e0-ab8f-96511da15218-ca-trust-extracted\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990407 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7g5n\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-kube-api-access-n7g5n\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.990420 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-bound-sa-token\") pod \"32af1403-874a-49e0-ab8f-96511da15218\" (UID: \"32af1403-874a-49e0-ab8f-96511da15218\") "
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.991343 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.991470 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.996650 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.996812 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-kube-api-access-n7g5n" (OuterVolumeSpecName: "kube-api-access-n7g5n") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "kube-api-access-n7g5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.998008 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32af1403-874a-49e0-ab8f-96511da15218-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:51.999749 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.000087 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.009711 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32af1403-874a-49e0-ab8f-96511da15218-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "32af1403-874a-49e0-ab8f-96511da15218" (UID: "32af1403-874a-49e0-ab8f-96511da15218"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.092220 4869 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.092250 4869 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.092262 4869 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32af1403-874a-49e0-ab8f-96511da15218-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.092270 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32af1403-874a-49e0-ab8f-96511da15218-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.092279 4869 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32af1403-874a-49e0-ab8f-96511da15218-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.092287 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7g5n\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-kube-api-access-n7g5n\") on node \"crc\" DevicePath \"\""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.092294 4869 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32af1403-874a-49e0-ab8f-96511da15218-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.355099 4869 generic.go:334] "Generic (PLEG): container finished" podID="32af1403-874a-49e0-ab8f-96511da15218" containerID="cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9" exitCode=0
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.355145 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" event={"ID":"32af1403-874a-49e0-ab8f-96511da15218","Type":"ContainerDied","Data":"cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9"}
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.355172 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" event={"ID":"32af1403-874a-49e0-ab8f-96511da15218","Type":"ContainerDied","Data":"dd9defa550f53785b552896d978862b07e6b63b096517727044c3799268a47b7"}
Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.355192 4869 scope.go:117] "RemoveContainer"
containerID="cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9" Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.355147 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5lmx8" Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.381503 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5lmx8"] Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.385169 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5lmx8"] Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.388809 4869 scope.go:117] "RemoveContainer" containerID="cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9" Mar 12 14:53:52 crc kubenswrapper[4869]: E0312 14:53:52.389945 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9\": container with ID starting with cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9 not found: ID does not exist" containerID="cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9" Mar 12 14:53:52 crc kubenswrapper[4869]: I0312 14:53:52.389981 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9"} err="failed to get container status \"cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9\": rpc error: code = NotFound desc = could not find container \"cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9\": container with ID starting with cca4c5f295d56b68dc26dd61a67685b5bf76ac37bb63423dac2daa3f6355f1e9 not found: ID does not exist" Mar 12 14:53:54 crc kubenswrapper[4869]: I0312 14:53:54.345137 4869 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="32af1403-874a-49e0-ab8f-96511da15218" path="/var/lib/kubelet/pods/32af1403-874a-49e0-ab8f-96511da15218/volumes" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.135008 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555454-5svs7"] Mar 12 14:54:00 crc kubenswrapper[4869]: E0312 14:54:00.136059 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32af1403-874a-49e0-ab8f-96511da15218" containerName="registry" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.136075 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="32af1403-874a-49e0-ab8f-96511da15218" containerName="registry" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.136179 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="32af1403-874a-49e0-ab8f-96511da15218" containerName="registry" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.136751 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-5svs7" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.144306 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.144921 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.145094 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.147557 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-5svs7"] Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.304079 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkvjl\" (UniqueName: \"kubernetes.io/projected/f1a92e51-8487-471e-a055-e7f63f101490-kube-api-access-xkvjl\") pod \"auto-csr-approver-29555454-5svs7\" (UID: \"f1a92e51-8487-471e-a055-e7f63f101490\") " pod="openshift-infra/auto-csr-approver-29555454-5svs7" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.405194 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkvjl\" (UniqueName: \"kubernetes.io/projected/f1a92e51-8487-471e-a055-e7f63f101490-kube-api-access-xkvjl\") pod \"auto-csr-approver-29555454-5svs7\" (UID: \"f1a92e51-8487-471e-a055-e7f63f101490\") " pod="openshift-infra/auto-csr-approver-29555454-5svs7" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.425250 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkvjl\" (UniqueName: \"kubernetes.io/projected/f1a92e51-8487-471e-a055-e7f63f101490-kube-api-access-xkvjl\") pod \"auto-csr-approver-29555454-5svs7\" (UID: \"f1a92e51-8487-471e-a055-e7f63f101490\") " 
pod="openshift-infra/auto-csr-approver-29555454-5svs7" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.460202 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-5svs7" Mar 12 14:54:00 crc kubenswrapper[4869]: I0312 14:54:00.633261 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-5svs7"] Mar 12 14:54:01 crc kubenswrapper[4869]: I0312 14:54:01.401009 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555454-5svs7" event={"ID":"f1a92e51-8487-471e-a055-e7f63f101490","Type":"ContainerStarted","Data":"ba2ac3f35d771c39e1a799b222f1ddedf106f59615c9dc4dbc0883db09025c5a"} Mar 12 14:54:02 crc kubenswrapper[4869]: I0312 14:54:02.407338 4869 generic.go:334] "Generic (PLEG): container finished" podID="f1a92e51-8487-471e-a055-e7f63f101490" containerID="962c67795a5a57351d812d8e4dfa75849a6944336d685c83fe2d3167f79c78d9" exitCode=0 Mar 12 14:54:02 crc kubenswrapper[4869]: I0312 14:54:02.407394 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555454-5svs7" event={"ID":"f1a92e51-8487-471e-a055-e7f63f101490","Type":"ContainerDied","Data":"962c67795a5a57351d812d8e4dfa75849a6944336d685c83fe2d3167f79c78d9"} Mar 12 14:54:03 crc kubenswrapper[4869]: I0312 14:54:03.599668 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-5svs7" Mar 12 14:54:03 crc kubenswrapper[4869]: I0312 14:54:03.744699 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkvjl\" (UniqueName: \"kubernetes.io/projected/f1a92e51-8487-471e-a055-e7f63f101490-kube-api-access-xkvjl\") pod \"f1a92e51-8487-471e-a055-e7f63f101490\" (UID: \"f1a92e51-8487-471e-a055-e7f63f101490\") " Mar 12 14:54:03 crc kubenswrapper[4869]: I0312 14:54:03.750458 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a92e51-8487-471e-a055-e7f63f101490-kube-api-access-xkvjl" (OuterVolumeSpecName: "kube-api-access-xkvjl") pod "f1a92e51-8487-471e-a055-e7f63f101490" (UID: "f1a92e51-8487-471e-a055-e7f63f101490"). InnerVolumeSpecName "kube-api-access-xkvjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:54:03 crc kubenswrapper[4869]: I0312 14:54:03.845872 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkvjl\" (UniqueName: \"kubernetes.io/projected/f1a92e51-8487-471e-a055-e7f63f101490-kube-api-access-xkvjl\") on node \"crc\" DevicePath \"\"" Mar 12 14:54:04 crc kubenswrapper[4869]: I0312 14:54:04.419106 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555454-5svs7" event={"ID":"f1a92e51-8487-471e-a055-e7f63f101490","Type":"ContainerDied","Data":"ba2ac3f35d771c39e1a799b222f1ddedf106f59615c9dc4dbc0883db09025c5a"} Mar 12 14:54:04 crc kubenswrapper[4869]: I0312 14:54:04.419144 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba2ac3f35d771c39e1a799b222f1ddedf106f59615c9dc4dbc0883db09025c5a" Mar 12 14:54:04 crc kubenswrapper[4869]: I0312 14:54:04.419267 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-5svs7" Mar 12 14:54:22 crc kubenswrapper[4869]: I0312 14:54:22.904622 4869 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-z6x5x container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 14:54:22 crc kubenswrapper[4869]: I0312 14:54:22.905274 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" podUID="d17df7e2-d85d-4172-aff7-0b5e63605a77" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 14:54:22 crc kubenswrapper[4869]: I0312 14:54:22.924260 4869 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-z6x5x container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 14:54:22 crc kubenswrapper[4869]: I0312 14:54:22.924371 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z6x5x" podUID="d17df7e2-d85d-4172-aff7-0b5e63605a77" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 14:54:49 crc kubenswrapper[4869]: I0312 14:54:49.684846 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:54:49 crc kubenswrapper[4869]: I0312 14:54:49.685532 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:55:19 crc kubenswrapper[4869]: I0312 14:55:19.684634 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:55:19 crc kubenswrapper[4869]: I0312 14:55:19.685334 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:55:49 crc kubenswrapper[4869]: I0312 14:55:49.684094 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:55:49 crc kubenswrapper[4869]: I0312 14:55:49.684718 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 12 14:55:49 crc kubenswrapper[4869]: I0312 14:55:49.684789 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:55:49 crc kubenswrapper[4869]: I0312 14:55:49.685471 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7938ffdcb0f5d68aef181bbbe274d4e45cbcba7c963f76ee6c1f153d4d2ccdd0"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:55:49 crc kubenswrapper[4869]: I0312 14:55:49.685530 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://7938ffdcb0f5d68aef181bbbe274d4e45cbcba7c963f76ee6c1f153d4d2ccdd0" gracePeriod=600 Mar 12 14:55:50 crc kubenswrapper[4869]: I0312 14:55:50.570201 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="7938ffdcb0f5d68aef181bbbe274d4e45cbcba7c963f76ee6c1f153d4d2ccdd0" exitCode=0 Mar 12 14:55:50 crc kubenswrapper[4869]: I0312 14:55:50.570295 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"7938ffdcb0f5d68aef181bbbe274d4e45cbcba7c963f76ee6c1f153d4d2ccdd0"} Mar 12 14:55:50 crc kubenswrapper[4869]: I0312 14:55:50.571019 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"782a4d0854bd913b58ff7c98e0d483c61a66e1f21a0fa88c58c100bef1826876"} Mar 12 14:55:50 crc kubenswrapper[4869]: I0312 14:55:50.571098 4869 scope.go:117] "RemoveContainer" containerID="9e48f067c773716c7d24fab1c2ac1e1bfd0b073b1e56d62472b739aafe4d8ef4" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.132492 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555456-2vfdb"] Mar 12 14:56:00 crc kubenswrapper[4869]: E0312 14:56:00.133596 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a92e51-8487-471e-a055-e7f63f101490" containerName="oc" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.133612 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a92e51-8487-471e-a055-e7f63f101490" containerName="oc" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.133731 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a92e51-8487-471e-a055-e7f63f101490" containerName="oc" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.134241 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.136188 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.136188 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.136373 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.138031 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-2vfdb"] Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.247184 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cst7\" (UniqueName: \"kubernetes.io/projected/e2c74cbb-f1e0-4776-aa0f-17893bf634f9-kube-api-access-8cst7\") pod \"auto-csr-approver-29555456-2vfdb\" (UID: \"e2c74cbb-f1e0-4776-aa0f-17893bf634f9\") " pod="openshift-infra/auto-csr-approver-29555456-2vfdb" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.348625 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cst7\" (UniqueName: \"kubernetes.io/projected/e2c74cbb-f1e0-4776-aa0f-17893bf634f9-kube-api-access-8cst7\") pod \"auto-csr-approver-29555456-2vfdb\" (UID: \"e2c74cbb-f1e0-4776-aa0f-17893bf634f9\") " pod="openshift-infra/auto-csr-approver-29555456-2vfdb" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.366497 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cst7\" (UniqueName: \"kubernetes.io/projected/e2c74cbb-f1e0-4776-aa0f-17893bf634f9-kube-api-access-8cst7\") pod \"auto-csr-approver-29555456-2vfdb\" (UID: \"e2c74cbb-f1e0-4776-aa0f-17893bf634f9\") " 
pod="openshift-infra/auto-csr-approver-29555456-2vfdb" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.452187 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.622016 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-2vfdb"] Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.631768 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:56:00 crc kubenswrapper[4869]: I0312 14:56:00.636205 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" event={"ID":"e2c74cbb-f1e0-4776-aa0f-17893bf634f9","Type":"ContainerStarted","Data":"12e98d50bde5e5949973d3bbeffd67034ae662c7e24b2bda7fd2e77960b0f86e"} Mar 12 14:56:04 crc kubenswrapper[4869]: I0312 14:56:04.204144 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" event={"ID":"e2c74cbb-f1e0-4776-aa0f-17893bf634f9","Type":"ContainerStarted","Data":"3ec89ccc59edba5da1e1493ceee89c161d17d3f4895126e8e89c4fabe29162a8"} Mar 12 14:56:04 crc kubenswrapper[4869]: I0312 14:56:04.222594 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" podStartSLOduration=0.925846287 podStartE2EDuration="4.222565575s" podCreationTimestamp="2026-03-12 14:56:00 +0000 UTC" firstStartedPulling="2026-03-12 14:56:00.631528142 +0000 UTC m=+512.916753420" lastFinishedPulling="2026-03-12 14:56:03.92824743 +0000 UTC m=+516.213472708" observedRunningTime="2026-03-12 14:56:04.216961974 +0000 UTC m=+516.502187252" watchObservedRunningTime="2026-03-12 14:56:04.222565575 +0000 UTC m=+516.507790873" Mar 12 14:56:05 crc kubenswrapper[4869]: I0312 14:56:05.210731 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="e2c74cbb-f1e0-4776-aa0f-17893bf634f9" containerID="3ec89ccc59edba5da1e1493ceee89c161d17d3f4895126e8e89c4fabe29162a8" exitCode=0 Mar 12 14:56:05 crc kubenswrapper[4869]: I0312 14:56:05.210832 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" event={"ID":"e2c74cbb-f1e0-4776-aa0f-17893bf634f9","Type":"ContainerDied","Data":"3ec89ccc59edba5da1e1493ceee89c161d17d3f4895126e8e89c4fabe29162a8"} Mar 12 14:56:06 crc kubenswrapper[4869]: I0312 14:56:06.415027 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" Mar 12 14:56:06 crc kubenswrapper[4869]: I0312 14:56:06.540029 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cst7\" (UniqueName: \"kubernetes.io/projected/e2c74cbb-f1e0-4776-aa0f-17893bf634f9-kube-api-access-8cst7\") pod \"e2c74cbb-f1e0-4776-aa0f-17893bf634f9\" (UID: \"e2c74cbb-f1e0-4776-aa0f-17893bf634f9\") " Mar 12 14:56:06 crc kubenswrapper[4869]: I0312 14:56:06.545001 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c74cbb-f1e0-4776-aa0f-17893bf634f9-kube-api-access-8cst7" (OuterVolumeSpecName: "kube-api-access-8cst7") pod "e2c74cbb-f1e0-4776-aa0f-17893bf634f9" (UID: "e2c74cbb-f1e0-4776-aa0f-17893bf634f9"). InnerVolumeSpecName "kube-api-access-8cst7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:56:06 crc kubenswrapper[4869]: I0312 14:56:06.641777 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cst7\" (UniqueName: \"kubernetes.io/projected/e2c74cbb-f1e0-4776-aa0f-17893bf634f9-kube-api-access-8cst7\") on node \"crc\" DevicePath \"\"" Mar 12 14:56:07 crc kubenswrapper[4869]: I0312 14:56:07.223379 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" event={"ID":"e2c74cbb-f1e0-4776-aa0f-17893bf634f9","Type":"ContainerDied","Data":"12e98d50bde5e5949973d3bbeffd67034ae662c7e24b2bda7fd2e77960b0f86e"} Mar 12 14:56:07 crc kubenswrapper[4869]: I0312 14:56:07.223420 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e98d50bde5e5949973d3bbeffd67034ae662c7e24b2bda7fd2e77960b0f86e" Mar 12 14:56:07 crc kubenswrapper[4869]: I0312 14:56:07.223468 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-2vfdb" Mar 12 14:56:07 crc kubenswrapper[4869]: I0312 14:56:07.267268 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-qsz9r"] Mar 12 14:56:07 crc kubenswrapper[4869]: I0312 14:56:07.270798 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-qsz9r"] Mar 12 14:56:08 crc kubenswrapper[4869]: I0312 14:56:08.343816 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a83972-34aa-4505-8b6f-b1345b7981cd" path="/var/lib/kubelet/pods/c6a83972-34aa-4505-8b6f-b1345b7981cd/volumes" Mar 12 14:56:28 crc kubenswrapper[4869]: I0312 14:56:28.990729 4869 scope.go:117] "RemoveContainer" containerID="27d456d07c8dc744d30a6ee25564a39108e1680d5cba5b71d11be77d04345460" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.129688 4869 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29555458-b2j8t"] Mar 12 14:58:00 crc kubenswrapper[4869]: E0312 14:58:00.131570 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c74cbb-f1e0-4776-aa0f-17893bf634f9" containerName="oc" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.131647 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c74cbb-f1e0-4776-aa0f-17893bf634f9" containerName="oc" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.131791 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c74cbb-f1e0-4776-aa0f-17893bf634f9" containerName="oc" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.132222 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.133882 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.134336 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.136026 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.140502 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-b2j8t"] Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.148582 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jg7m\" (UniqueName: \"kubernetes.io/projected/f78a2bb0-61ee-488f-986d-4e1f62a7ff0c-kube-api-access-8jg7m\") pod \"auto-csr-approver-29555458-b2j8t\" (UID: \"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c\") " pod="openshift-infra/auto-csr-approver-29555458-b2j8t" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 
14:58:00.250354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jg7m\" (UniqueName: \"kubernetes.io/projected/f78a2bb0-61ee-488f-986d-4e1f62a7ff0c-kube-api-access-8jg7m\") pod \"auto-csr-approver-29555458-b2j8t\" (UID: \"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c\") " pod="openshift-infra/auto-csr-approver-29555458-b2j8t" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.268443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jg7m\" (UniqueName: \"kubernetes.io/projected/f78a2bb0-61ee-488f-986d-4e1f62a7ff0c-kube-api-access-8jg7m\") pod \"auto-csr-approver-29555458-b2j8t\" (UID: \"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c\") " pod="openshift-infra/auto-csr-approver-29555458-b2j8t" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.455362 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.653971 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-b2j8t"] Mar 12 14:58:00 crc kubenswrapper[4869]: I0312 14:58:00.841569 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" event={"ID":"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c","Type":"ContainerStarted","Data":"48b91f0bac9d1fa57f1bf30a225353a4a5bded03089933b76dcc463701a97c3d"} Mar 12 14:58:02 crc kubenswrapper[4869]: I0312 14:58:02.853923 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" event={"ID":"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c","Type":"ContainerStarted","Data":"601bd64c4b111a60c475cc2d26c60302abd2916331fd8f1ab3780f494751fdeb"} Mar 12 14:58:02 crc kubenswrapper[4869]: I0312 14:58:02.866059 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" 
podStartSLOduration=0.992696479 podStartE2EDuration="2.866039768s" podCreationTimestamp="2026-03-12 14:58:00 +0000 UTC" firstStartedPulling="2026-03-12 14:58:00.663717546 +0000 UTC m=+632.948942824" lastFinishedPulling="2026-03-12 14:58:02.537060825 +0000 UTC m=+634.822286113" observedRunningTime="2026-03-12 14:58:02.865304227 +0000 UTC m=+635.150529515" watchObservedRunningTime="2026-03-12 14:58:02.866039768 +0000 UTC m=+635.151265046" Mar 12 14:58:03 crc kubenswrapper[4869]: I0312 14:58:03.859337 4869 generic.go:334] "Generic (PLEG): container finished" podID="f78a2bb0-61ee-488f-986d-4e1f62a7ff0c" containerID="601bd64c4b111a60c475cc2d26c60302abd2916331fd8f1ab3780f494751fdeb" exitCode=0 Mar 12 14:58:03 crc kubenswrapper[4869]: I0312 14:58:03.859383 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" event={"ID":"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c","Type":"ContainerDied","Data":"601bd64c4b111a60c475cc2d26c60302abd2916331fd8f1ab3780f494751fdeb"} Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.047636 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.205932 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jg7m\" (UniqueName: \"kubernetes.io/projected/f78a2bb0-61ee-488f-986d-4e1f62a7ff0c-kube-api-access-8jg7m\") pod \"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c\" (UID: \"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c\") " Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.212461 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78a2bb0-61ee-488f-986d-4e1f62a7ff0c-kube-api-access-8jg7m" (OuterVolumeSpecName: "kube-api-access-8jg7m") pod "f78a2bb0-61ee-488f-986d-4e1f62a7ff0c" (UID: "f78a2bb0-61ee-488f-986d-4e1f62a7ff0c"). InnerVolumeSpecName "kube-api-access-8jg7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.307682 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jg7m\" (UniqueName: \"kubernetes.io/projected/f78a2bb0-61ee-488f-986d-4e1f62a7ff0c-kube-api-access-8jg7m\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.871759 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" event={"ID":"f78a2bb0-61ee-488f-986d-4e1f62a7ff0c","Type":"ContainerDied","Data":"48b91f0bac9d1fa57f1bf30a225353a4a5bded03089933b76dcc463701a97c3d"} Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.871799 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48b91f0bac9d1fa57f1bf30a225353a4a5bded03089933b76dcc463701a97c3d" Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.871831 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-b2j8t" Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.916321 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-jbkgg"] Mar 12 14:58:05 crc kubenswrapper[4869]: I0312 14:58:05.921658 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-jbkgg"] Mar 12 14:58:06 crc kubenswrapper[4869]: I0312 14:58:06.342511 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2f746c-3fb9-425d-a06c-62c3cf768500" path="/var/lib/kubelet/pods/bf2f746c-3fb9-425d-a06c-62c3cf768500/volumes" Mar 12 14:58:19 crc kubenswrapper[4869]: I0312 14:58:19.684716 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 12 14:58:19 crc kubenswrapper[4869]: I0312 14:58:19.685952 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:58:29 crc kubenswrapper[4869]: I0312 14:58:29.062677 4869 scope.go:117] "RemoveContainer" containerID="8f4927e0f25391a515e345445412ddbdfae727c49288de1397c6a4501444f551" Mar 12 14:58:49 crc kubenswrapper[4869]: I0312 14:58:49.684085 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:58:49 crc kubenswrapper[4869]: I0312 14:58:49.684691 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.531173 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v"] Mar 12 14:59:03 crc kubenswrapper[4869]: E0312 14:59:03.532132 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78a2bb0-61ee-488f-986d-4e1f62a7ff0c" containerName="oc" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.532150 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78a2bb0-61ee-488f-986d-4e1f62a7ff0c" containerName="oc" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.532292 4869 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f78a2bb0-61ee-488f-986d-4e1f62a7ff0c" containerName="oc" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.532792 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.538876 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.538921 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.539025 4869 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dhw5l" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.543873 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v"] Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.552437 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-9869l"] Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.553373 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9869l" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.557026 4869 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pgs2n" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.565947 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-9869l"] Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.577981 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bgzmw"] Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.580415 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.585441 4869 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-lzwhb" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.590735 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bgzmw"] Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.665985 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwjcp\" (UniqueName: \"kubernetes.io/projected/06128598-c475-46aa-8109-12eea3f15bfb-kube-api-access-lwjcp\") pod \"cert-manager-cainjector-cf98fcc89-rxw2v\" (UID: \"06128598-c475-46aa-8109-12eea3f15bfb\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.666189 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwn98\" (UniqueName: \"kubernetes.io/projected/1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5-kube-api-access-jwn98\") pod \"cert-manager-858654f9db-9869l\" (UID: \"1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5\") " pod="cert-manager/cert-manager-858654f9db-9869l" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.767192 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9lmw\" (UniqueName: \"kubernetes.io/projected/ab036352-d0df-4dea-bd2d-27fde11414c7-kube-api-access-v9lmw\") pod \"cert-manager-webhook-687f57d79b-bgzmw\" (UID: \"ab036352-d0df-4dea-bd2d-27fde11414c7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.767243 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwjcp\" (UniqueName: 
\"kubernetes.io/projected/06128598-c475-46aa-8109-12eea3f15bfb-kube-api-access-lwjcp\") pod \"cert-manager-cainjector-cf98fcc89-rxw2v\" (UID: \"06128598-c475-46aa-8109-12eea3f15bfb\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.767290 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwn98\" (UniqueName: \"kubernetes.io/projected/1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5-kube-api-access-jwn98\") pod \"cert-manager-858654f9db-9869l\" (UID: \"1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5\") " pod="cert-manager/cert-manager-858654f9db-9869l" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.792283 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwjcp\" (UniqueName: \"kubernetes.io/projected/06128598-c475-46aa-8109-12eea3f15bfb-kube-api-access-lwjcp\") pod \"cert-manager-cainjector-cf98fcc89-rxw2v\" (UID: \"06128598-c475-46aa-8109-12eea3f15bfb\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.793247 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwn98\" (UniqueName: \"kubernetes.io/projected/1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5-kube-api-access-jwn98\") pod \"cert-manager-858654f9db-9869l\" (UID: \"1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5\") " pod="cert-manager/cert-manager-858654f9db-9869l" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.849315 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.869394 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9lmw\" (UniqueName: \"kubernetes.io/projected/ab036352-d0df-4dea-bd2d-27fde11414c7-kube-api-access-v9lmw\") pod \"cert-manager-webhook-687f57d79b-bgzmw\" (UID: \"ab036352-d0df-4dea-bd2d-27fde11414c7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.870589 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9869l" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.885319 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9lmw\" (UniqueName: \"kubernetes.io/projected/ab036352-d0df-4dea-bd2d-27fde11414c7-kube-api-access-v9lmw\") pod \"cert-manager-webhook-687f57d79b-bgzmw\" (UID: \"ab036352-d0df-4dea-bd2d-27fde11414c7\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" Mar 12 14:59:03 crc kubenswrapper[4869]: I0312 14:59:03.912087 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" Mar 12 14:59:04 crc kubenswrapper[4869]: I0312 14:59:04.148080 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bgzmw"] Mar 12 14:59:04 crc kubenswrapper[4869]: W0312 14:59:04.157825 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab036352_d0df_4dea_bd2d_27fde11414c7.slice/crio-af34eac536b254f532fc3d844d35e9f36d9990af986fe89475feb53850a0f686 WatchSource:0}: Error finding container af34eac536b254f532fc3d844d35e9f36d9990af986fe89475feb53850a0f686: Status 404 returned error can't find the container with id af34eac536b254f532fc3d844d35e9f36d9990af986fe89475feb53850a0f686 Mar 12 14:59:04 crc kubenswrapper[4869]: I0312 14:59:04.213433 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" event={"ID":"ab036352-d0df-4dea-bd2d-27fde11414c7","Type":"ContainerStarted","Data":"af34eac536b254f532fc3d844d35e9f36d9990af986fe89475feb53850a0f686"} Mar 12 14:59:04 crc kubenswrapper[4869]: W0312 14:59:04.289742 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06128598_c475_46aa_8109_12eea3f15bfb.slice/crio-7fce88c857ed386617e7c69e5a6c0b2e91f274b9aeaccf6b1c8cc35507eb2e17 WatchSource:0}: Error finding container 7fce88c857ed386617e7c69e5a6c0b2e91f274b9aeaccf6b1c8cc35507eb2e17: Status 404 returned error can't find the container with id 7fce88c857ed386617e7c69e5a6c0b2e91f274b9aeaccf6b1c8cc35507eb2e17 Mar 12 14:59:04 crc kubenswrapper[4869]: I0312 14:59:04.290898 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v"] Mar 12 14:59:04 crc kubenswrapper[4869]: I0312 14:59:04.296262 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-858654f9db-9869l"] Mar 12 14:59:05 crc kubenswrapper[4869]: I0312 14:59:05.220862 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" event={"ID":"06128598-c475-46aa-8109-12eea3f15bfb","Type":"ContainerStarted","Data":"7fce88c857ed386617e7c69e5a6c0b2e91f274b9aeaccf6b1c8cc35507eb2e17"} Mar 12 14:59:05 crc kubenswrapper[4869]: I0312 14:59:05.222007 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9869l" event={"ID":"1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5","Type":"ContainerStarted","Data":"c048942f1c02035597df55147fd11143ceda0325aff2e1c9fcce47641452fb7a"} Mar 12 14:59:08 crc kubenswrapper[4869]: I0312 14:59:08.249033 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" event={"ID":"ab036352-d0df-4dea-bd2d-27fde11414c7","Type":"ContainerStarted","Data":"ced8ddd3c61b275ed94b50984886a52671126cebbe486c83b463afb760cd9851"} Mar 12 14:59:08 crc kubenswrapper[4869]: I0312 14:59:08.250626 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" Mar 12 14:59:08 crc kubenswrapper[4869]: I0312 14:59:08.306475 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" podStartSLOduration=2.017875516 podStartE2EDuration="5.306333618s" podCreationTimestamp="2026-03-12 14:59:03 +0000 UTC" firstStartedPulling="2026-03-12 14:59:04.160187821 +0000 UTC m=+696.445413099" lastFinishedPulling="2026-03-12 14:59:07.448645923 +0000 UTC m=+699.733871201" observedRunningTime="2026-03-12 14:59:08.300185843 +0000 UTC m=+700.585411131" watchObservedRunningTime="2026-03-12 14:59:08.306333618 +0000 UTC m=+700.591558896" Mar 12 14:59:09 crc kubenswrapper[4869]: I0312 14:59:09.254922 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" event={"ID":"06128598-c475-46aa-8109-12eea3f15bfb","Type":"ContainerStarted","Data":"e3db37e738d3bc82eb6801437454e7ef17e0bcfc995e0e181d5373e9d47f23d7"} Mar 12 14:59:09 crc kubenswrapper[4869]: I0312 14:59:09.256298 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9869l" event={"ID":"1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5","Type":"ContainerStarted","Data":"e19b7cc3a5e82cd9a9f353096bac93de5d82268dfd3f8f6101415bdc5893a58d"} Mar 12 14:59:09 crc kubenswrapper[4869]: I0312 14:59:09.272331 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rxw2v" podStartSLOduration=1.629936786 podStartE2EDuration="6.27231045s" podCreationTimestamp="2026-03-12 14:59:03 +0000 UTC" firstStartedPulling="2026-03-12 14:59:04.292955976 +0000 UTC m=+696.578181264" lastFinishedPulling="2026-03-12 14:59:08.93532965 +0000 UTC m=+701.220554928" observedRunningTime="2026-03-12 14:59:09.268650186 +0000 UTC m=+701.553875484" watchObservedRunningTime="2026-03-12 14:59:09.27231045 +0000 UTC m=+701.557535728" Mar 12 14:59:09 crc kubenswrapper[4869]: I0312 14:59:09.294570 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-9869l" podStartSLOduration=1.583367252 podStartE2EDuration="6.294441209s" podCreationTimestamp="2026-03-12 14:59:03 +0000 UTC" firstStartedPulling="2026-03-12 14:59:04.303426833 +0000 UTC m=+696.588652111" lastFinishedPulling="2026-03-12 14:59:09.01450079 +0000 UTC m=+701.299726068" observedRunningTime="2026-03-12 14:59:09.292735251 +0000 UTC m=+701.577960529" watchObservedRunningTime="2026-03-12 14:59:09.294441209 +0000 UTC m=+701.579666497" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.640071 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42vwv"] Mar 12 14:59:13 crc kubenswrapper[4869]: 
I0312 14:59:13.640853 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="nbdb" containerID="cri-o://51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.640897 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="northd" containerID="cri-o://6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.640923 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-node" containerID="cri-o://4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.640845 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-controller" containerID="cri-o://5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.640989 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="sbdb" containerID="cri-o://c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.640988 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" 
podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.641000 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-acl-logging" containerID="cri-o://8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.681089 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovnkube-controller" containerID="cri-o://571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" gracePeriod=30 Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.907526 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwv_7edaf111-2689-4453-ba78-00677e1b6316/ovn-acl-logging/0.log" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.908258 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwv_7edaf111-2689-4453-ba78-00677e1b6316/ovn-controller/0.log" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.908725 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.914498 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-bgzmw" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957082 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8zsn"] Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957452 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="northd" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957478 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="northd" Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957502 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="nbdb" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957515 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="nbdb" Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957558 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-controller" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957572 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-controller" Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957590 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kubecfg-setup" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957606 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kubecfg-setup" Mar 12 14:59:13 crc 
kubenswrapper[4869]: E0312 14:59:13.957624 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-acl-logging" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957636 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-acl-logging" Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957650 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="sbdb" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957662 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="sbdb" Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957693 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957708 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957730 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-node" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957743 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-node" Mar 12 14:59:13 crc kubenswrapper[4869]: E0312 14:59:13.957764 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovnkube-controller" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957776 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovnkube-controller" Mar 12 14:59:13 
crc kubenswrapper[4869]: I0312 14:59:13.957954 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="nbdb" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957977 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-node" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.957992 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovnkube-controller" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.958005 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.958025 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-acl-logging" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.958042 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="ovn-controller" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.958058 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="northd" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.958075 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7edaf111-2689-4453-ba78-00677e1b6316" containerName="sbdb" Mar 12 14:59:13 crc kubenswrapper[4869]: I0312 14:59:13.961168 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.013938 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-bin\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.013992 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7rnt\" (UniqueName: \"kubernetes.io/projected/7edaf111-2689-4453-ba78-00677e1b6316-kube-api-access-t7rnt\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014001 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014014 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014078 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014155 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-netd\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014190 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014205 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-ovn-kubernetes\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014243 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014428 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-script-lib\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014487 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-etc-openvswitch\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014516 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-config\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc 
kubenswrapper[4869]: I0312 14:59:14.014578 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014585 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-env-overrides\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014617 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-kubelet\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014636 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-openvswitch\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014659 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7edaf111-2689-4453-ba78-00677e1b6316-ovn-node-metrics-cert\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014677 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-log-socket\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014693 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-systemd\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014710 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-ovn\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014723 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-var-lib-openvswitch\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014740 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-netns\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014761 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-systemd-units\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: 
I0312 14:59:14.014773 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-node-log\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014794 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-slash\") pod \"7edaf111-2689-4453-ba78-00677e1b6316\" (UID: \"7edaf111-2689-4453-ba78-00677e1b6316\") " Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014783 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014860 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014883 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5e55f41-e780-4667-8435-596089e03974-ovn-node-metrics-cert\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014887 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014904 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-run-netns\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014900 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014923 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-ovn\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014939 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-slash" (OuterVolumeSpecName: "host-slash") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014939 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014963 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014969 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-node-log" (OuterVolumeSpecName: "node-log") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014975 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-log-socket" (OuterVolumeSpecName: "log-socket") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014908 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014991 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.014987 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-env-overrides\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015016 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015130 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-kubelet\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015193 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-etc-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015275 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-node-log\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015316 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-systemd-units\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015347 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-log-socket\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015370 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8djs\" (UniqueName: \"kubernetes.io/projected/b5e55f41-e780-4667-8435-596089e03974-kube-api-access-s8djs\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015434 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-var-lib-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015474 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-cni-bin\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015496 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-ovnkube-script-lib\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015531 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-systemd\") pod \"ovnkube-node-v8zsn\" (UID: 
\"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015590 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015609 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-ovnkube-config\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015644 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-cni-netd\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015668 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-slash\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015692 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.015981 4869 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016016 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016033 4869 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016048 4869 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016060 4869 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016071 4869 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016083 4869 reconciler_common.go:293] "Volume detached for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-log-socket\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016093 4869 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016103 4869 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016115 4869 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016127 4869 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016139 4869 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-node-log\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016151 4869 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-slash\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016162 4869 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 
crc kubenswrapper[4869]: I0312 14:59:14.016173 4869 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016182 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.016188 4869 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.019448 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7edaf111-2689-4453-ba78-00677e1b6316-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.019698 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7edaf111-2689-4453-ba78-00677e1b6316-kube-api-access-t7rnt" (OuterVolumeSpecName: "kube-api-access-t7rnt") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "kube-api-access-t7rnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.026864 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7edaf111-2689-4453-ba78-00677e1b6316" (UID: "7edaf111-2689-4453-ba78-00677e1b6316"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.116772 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-node-log\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.116851 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-systemd-units\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.116875 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-log-socket\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.116897 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8djs\" (UniqueName: \"kubernetes.io/projected/b5e55f41-e780-4667-8435-596089e03974-kube-api-access-s8djs\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.116918 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-node-log\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.116984 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-var-lib-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.116929 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-var-lib-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117030 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-systemd-units\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117079 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-log-socket\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117099 
4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-cni-bin\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117129 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-cni-bin\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117207 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-ovnkube-script-lib\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117320 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-systemd\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117350 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117370 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-ovnkube-config\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117394 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-cni-netd\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117414 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117435 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-slash\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117484 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5e55f41-e780-4667-8435-596089e03974-ovn-node-metrics-cert\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117521 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117563 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-run-netns\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117588 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-ovn\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117611 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-env-overrides\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117673 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-kubelet\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117706 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-etc-openvswitch\") 
pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117834 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7rnt\" (UniqueName: \"kubernetes.io/projected/7edaf111-2689-4453-ba78-00677e1b6316-kube-api-access-t7rnt\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117849 4869 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7edaf111-2689-4453-ba78-00677e1b6316-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117862 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7edaf111-2689-4453-ba78-00677e1b6316-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117871 4869 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7edaf111-2689-4453-ba78-00677e1b6316-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117888 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-systemd\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117910 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc 
kubenswrapper[4869]: I0312 14:59:14.117918 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-etc-openvswitch\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117939 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-run-ovn\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117955 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.117997 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-run-netns\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.118008 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-cni-netd\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.118009 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-ovnkube-script-lib\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.118033 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.118054 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-slash\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.118059 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5e55f41-e780-4667-8435-596089e03974-host-kubelet\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.118747 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-ovnkube-config\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.118911 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/b5e55f41-e780-4667-8435-596089e03974-env-overrides\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.128441 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5e55f41-e780-4667-8435-596089e03974-ovn-node-metrics-cert\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.139532 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8djs\" (UniqueName: \"kubernetes.io/projected/b5e55f41-e780-4667-8435-596089e03974-kube-api-access-s8djs\") pod \"ovnkube-node-v8zsn\" (UID: \"b5e55f41-e780-4667-8435-596089e03974\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.277422 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.285212 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-l8qfx_2fd2bb3f-6860-4631-a95c-c910d33724b6/kube-multus/0.log" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.285292 4869 generic.go:334] "Generic (PLEG): container finished" podID="2fd2bb3f-6860-4631-a95c-c910d33724b6" containerID="a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6" exitCode=2 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.285371 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l8qfx" event={"ID":"2fd2bb3f-6860-4631-a95c-c910d33724b6","Type":"ContainerDied","Data":"a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.286207 4869 scope.go:117] "RemoveContainer" containerID="a83093a69c54535bd941bdaf89e585ba3f4af93800644747e8e2bdfcfd3bb0c6" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.290130 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwv_7edaf111-2689-4453-ba78-00677e1b6316/ovn-acl-logging/0.log" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.290691 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwv_7edaf111-2689-4453-ba78-00677e1b6316/ovn-controller/0.log" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291645 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" exitCode=0 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291664 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" exitCode=0 
Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291673 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" exitCode=0 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291684 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" exitCode=0 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291693 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" exitCode=0 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291704 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" exitCode=0 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291712 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" exitCode=143 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291721 4869 generic.go:334] "Generic (PLEG): container finished" podID="7edaf111-2689-4453-ba78-00677e1b6316" containerID="5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" exitCode=143 Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291741 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291763 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291776 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291785 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291794 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291803 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291813 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291822 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} Mar 12 14:59:14 crc 
kubenswrapper[4869]: I0312 14:59:14.291827 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291834 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291841 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291847 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291852 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291857 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291862 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291868 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291873 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291877 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291883 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291889 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291896 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291902 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291907 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291912 4869 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291917 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291921 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291926 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291931 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291935 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291942 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" event={"ID":"7edaf111-2689-4453-ba78-00677e1b6316","Type":"ContainerDied","Data":"7f1e42c08801651f0218ac42cd5a6a27bea2aa0c9fff4ba369ad616449216821"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291949 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} Mar 12 
14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291955 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291959 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291964 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291971 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291976 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291980 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291985 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.291990 4869 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} Mar 12 
14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.292005 4869 scope.go:117] "RemoveContainer" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.292177 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42vwv" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.319094 4869 scope.go:117] "RemoveContainer" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.332029 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42vwv"] Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.335634 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42vwv"] Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.342131 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7edaf111-2689-4453-ba78-00677e1b6316" path="/var/lib/kubelet/pods/7edaf111-2689-4453-ba78-00677e1b6316/volumes" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.342595 4869 scope.go:117] "RemoveContainer" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.364735 4869 scope.go:117] "RemoveContainer" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.380231 4869 scope.go:117] "RemoveContainer" containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.397933 4869 scope.go:117] "RemoveContainer" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.414318 4869 scope.go:117] "RemoveContainer" 
containerID="8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.498505 4869 scope.go:117] "RemoveContainer" containerID="5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.511097 4869 scope.go:117] "RemoveContainer" containerID="70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.533748 4869 scope.go:117] "RemoveContainer" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.534814 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": container with ID starting with 571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160 not found: ID does not exist" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.534865 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} err="failed to get container status \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": rpc error: code = NotFound desc = could not find container \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": container with ID starting with 571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.534939 4869 scope.go:117] "RemoveContainer" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.535259 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": container with ID starting with c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da not found: ID does not exist" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.535299 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} err="failed to get container status \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": rpc error: code = NotFound desc = could not find container \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": container with ID starting with c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.535323 4869 scope.go:117] "RemoveContainer" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.535651 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": container with ID starting with 51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58 not found: ID does not exist" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.535695 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} err="failed to get container status \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": rpc error: code = NotFound desc = could not find container 
\"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": container with ID starting with 51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.535723 4869 scope.go:117] "RemoveContainer" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.536246 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": container with ID starting with 6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e not found: ID does not exist" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.536285 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} err="failed to get container status \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": rpc error: code = NotFound desc = could not find container \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": container with ID starting with 6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.536306 4869 scope.go:117] "RemoveContainer" containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.536648 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": container with ID starting with a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89 not found: ID does not exist" 
containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.536678 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} err="failed to get container status \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": rpc error: code = NotFound desc = could not find container \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": container with ID starting with a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.536696 4869 scope.go:117] "RemoveContainer" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.537212 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": container with ID starting with 4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004 not found: ID does not exist" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.537242 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} err="failed to get container status \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": rpc error: code = NotFound desc = could not find container \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": container with ID starting with 4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.537264 4869 scope.go:117] 
"RemoveContainer" containerID="8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.537553 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": container with ID starting with 8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a not found: ID does not exist" containerID="8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.537580 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} err="failed to get container status \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": rpc error: code = NotFound desc = could not find container \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": container with ID starting with 8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.537601 4869 scope.go:117] "RemoveContainer" containerID="5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.537892 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": container with ID starting with 5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5 not found: ID does not exist" containerID="5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.537922 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} err="failed to get container status \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": rpc error: code = NotFound desc = could not find container \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": container with ID starting with 5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.537940 4869 scope.go:117] "RemoveContainer" containerID="70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a" Mar 12 14:59:14 crc kubenswrapper[4869]: E0312 14:59:14.538225 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": container with ID starting with 70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a not found: ID does not exist" containerID="70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.538258 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} err="failed to get container status \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": rpc error: code = NotFound desc = could not find container \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": container with ID starting with 70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.538280 4869 scope.go:117] "RemoveContainer" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.538561 4869 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} err="failed to get container status \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": rpc error: code = NotFound desc = could not find container \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": container with ID starting with 571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.538586 4869 scope.go:117] "RemoveContainer" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.538839 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} err="failed to get container status \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": rpc error: code = NotFound desc = could not find container \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": container with ID starting with c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.538862 4869 scope.go:117] "RemoveContainer" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.539083 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} err="failed to get container status \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": rpc error: code = NotFound desc = could not find container \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": container with ID starting with 51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58 not 
found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.539107 4869 scope.go:117] "RemoveContainer" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.539371 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} err="failed to get container status \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": rpc error: code = NotFound desc = could not find container \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": container with ID starting with 6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.539402 4869 scope.go:117] "RemoveContainer" containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.539654 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} err="failed to get container status \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": rpc error: code = NotFound desc = could not find container \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": container with ID starting with a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.539678 4869 scope.go:117] "RemoveContainer" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.540129 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} err="failed to get 
container status \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": rpc error: code = NotFound desc = could not find container \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": container with ID starting with 4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.540154 4869 scope.go:117] "RemoveContainer" containerID="8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.540390 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} err="failed to get container status \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": rpc error: code = NotFound desc = could not find container \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": container with ID starting with 8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.540419 4869 scope.go:117] "RemoveContainer" containerID="5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.540717 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} err="failed to get container status \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": rpc error: code = NotFound desc = could not find container \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": container with ID starting with 5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.540743 4869 scope.go:117] "RemoveContainer" 
containerID="70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.541212 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} err="failed to get container status \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": rpc error: code = NotFound desc = could not find container \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": container with ID starting with 70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.541287 4869 scope.go:117] "RemoveContainer" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.541592 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} err="failed to get container status \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": rpc error: code = NotFound desc = could not find container \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": container with ID starting with 571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.541635 4869 scope.go:117] "RemoveContainer" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.541998 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} err="failed to get container status \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": rpc error: code = NotFound desc = could 
not find container \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": container with ID starting with c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.542053 4869 scope.go:117] "RemoveContainer" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.542363 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} err="failed to get container status \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": rpc error: code = NotFound desc = could not find container \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": container with ID starting with 51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.542393 4869 scope.go:117] "RemoveContainer" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.542665 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} err="failed to get container status \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": rpc error: code = NotFound desc = could not find container \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": container with ID starting with 6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.542693 4869 scope.go:117] "RemoveContainer" containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 
14:59:14.542941 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} err="failed to get container status \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": rpc error: code = NotFound desc = could not find container \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": container with ID starting with a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.542973 4869 scope.go:117] "RemoveContainer" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.543289 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} err="failed to get container status \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": rpc error: code = NotFound desc = could not find container \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": container with ID starting with 4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.543318 4869 scope.go:117] "RemoveContainer" containerID="8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.543666 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} err="failed to get container status \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": rpc error: code = NotFound desc = could not find container \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": container with ID starting with 
8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.543690 4869 scope.go:117] "RemoveContainer" containerID="5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.544036 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} err="failed to get container status \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": rpc error: code = NotFound desc = could not find container \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": container with ID starting with 5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.544070 4869 scope.go:117] "RemoveContainer" containerID="70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.544590 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} err="failed to get container status \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": rpc error: code = NotFound desc = could not find container \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": container with ID starting with 70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.544618 4869 scope.go:117] "RemoveContainer" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.544869 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} err="failed to get container status \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": rpc error: code = NotFound desc = could not find container \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": container with ID starting with 571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.544893 4869 scope.go:117] "RemoveContainer" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.545164 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} err="failed to get container status \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": rpc error: code = NotFound desc = could not find container \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": container with ID starting with c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.545188 4869 scope.go:117] "RemoveContainer" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.545422 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} err="failed to get container status \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": rpc error: code = NotFound desc = could not find container \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": container with ID starting with 51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58 not found: ID does not 
exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.545450 4869 scope.go:117] "RemoveContainer" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.545777 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} err="failed to get container status \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": rpc error: code = NotFound desc = could not find container \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": container with ID starting with 6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.545802 4869 scope.go:117] "RemoveContainer" containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.546072 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} err="failed to get container status \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": rpc error: code = NotFound desc = could not find container \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": container with ID starting with a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.546108 4869 scope.go:117] "RemoveContainer" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.546474 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} err="failed to get container status 
\"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": rpc error: code = NotFound desc = could not find container \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": container with ID starting with 4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.546509 4869 scope.go:117] "RemoveContainer" containerID="8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.546838 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a"} err="failed to get container status \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": rpc error: code = NotFound desc = could not find container \"8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a\": container with ID starting with 8bb7615660554d3eb2719e46045dc9d6699152e0f276f8c1d6065ed4060f9f7a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.546862 4869 scope.go:117] "RemoveContainer" containerID="5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.547173 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5"} err="failed to get container status \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": rpc error: code = NotFound desc = could not find container \"5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5\": container with ID starting with 5d8e2551217a24af090084a54df75ab1d7ed1cc066577b6385867e41c18115f5 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.547212 4869 scope.go:117] "RemoveContainer" 
containerID="70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.547510 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a"} err="failed to get container status \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": rpc error: code = NotFound desc = could not find container \"70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a\": container with ID starting with 70fc1f84a9a35111063d35008ea70881943f5513b37dae1f93bfe8abf805554a not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.547553 4869 scope.go:117] "RemoveContainer" containerID="571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.547852 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160"} err="failed to get container status \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": rpc error: code = NotFound desc = could not find container \"571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160\": container with ID starting with 571fb7817d81131066b4026a8b964a3f230361d460540ab7e9369fc8b5961160 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.547877 4869 scope.go:117] "RemoveContainer" containerID="c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.548237 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da"} err="failed to get container status \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": rpc error: code = NotFound desc = could 
not find container \"c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da\": container with ID starting with c8ff27d67647b948c0d7471c0fb406507de4634db88ce75901622be5af0d65da not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.548255 4869 scope.go:117] "RemoveContainer" containerID="51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.548550 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58"} err="failed to get container status \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": rpc error: code = NotFound desc = could not find container \"51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58\": container with ID starting with 51bbfda11dafa6eb7a920efac5723aacffb3fc3e3951dbed1db6b880fa8c4f58 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.548575 4869 scope.go:117] "RemoveContainer" containerID="6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.548874 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e"} err="failed to get container status \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": rpc error: code = NotFound desc = could not find container \"6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e\": container with ID starting with 6ae530aa7257022916f6d0d92adea3c7053968c54ed9ef05c1fac10773209f3e not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.548917 4869 scope.go:117] "RemoveContainer" containerID="a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 
14:59:14.549339 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89"} err="failed to get container status \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": rpc error: code = NotFound desc = could not find container \"a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89\": container with ID starting with a09be45ccfff10dcb63e3bff4393406583cd07eee574b5a862a32d93dda1ea89 not found: ID does not exist" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.549375 4869 scope.go:117] "RemoveContainer" containerID="4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004" Mar 12 14:59:14 crc kubenswrapper[4869]: I0312 14:59:14.549671 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004"} err="failed to get container status \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": rpc error: code = NotFound desc = could not find container \"4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004\": container with ID starting with 4ad2e003a61afe80836d59a40785a5f1eb243874c62353dcdd13025af4f48004 not found: ID does not exist" Mar 12 14:59:15 crc kubenswrapper[4869]: I0312 14:59:15.300661 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-l8qfx_2fd2bb3f-6860-4631-a95c-c910d33724b6/kube-multus/0.log" Mar 12 14:59:15 crc kubenswrapper[4869]: I0312 14:59:15.300757 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l8qfx" event={"ID":"2fd2bb3f-6860-4631-a95c-c910d33724b6","Type":"ContainerStarted","Data":"610800b5fc04120447d6320b6de17100c9c0a5a3ac40ff5adf3b79895b285041"} Mar 12 14:59:15 crc kubenswrapper[4869]: I0312 14:59:15.303391 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="b5e55f41-e780-4667-8435-596089e03974" containerID="8b6f9fdbb7142e28cf039ec181204f6867cfae8170aa03c35b0b250d40529e11" exitCode=0 Mar 12 14:59:15 crc kubenswrapper[4869]: I0312 14:59:15.303438 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerDied","Data":"8b6f9fdbb7142e28cf039ec181204f6867cfae8170aa03c35b0b250d40529e11"} Mar 12 14:59:15 crc kubenswrapper[4869]: I0312 14:59:15.303472 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"989aefcf7d575f12e4d024ad003db2c09ecc29531139c12f71ba2259ab87eef9"} Mar 12 14:59:16 crc kubenswrapper[4869]: I0312 14:59:16.311268 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"2319adb4e57208b0e36446d3806dc9d7f93f526776f9a0aba2c2427817781c81"} Mar 12 14:59:16 crc kubenswrapper[4869]: I0312 14:59:16.311781 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"07a07a5fec91d34b8ee6e1f7cf5c8147789606acc7a063774b3f6d0ba46ad9c3"} Mar 12 14:59:16 crc kubenswrapper[4869]: I0312 14:59:16.311795 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"ec02f96653dd31c2bcf54638a2bdd034f74dd00b856a36b27abdf1881541eb32"} Mar 12 14:59:16 crc kubenswrapper[4869]: I0312 14:59:16.311805 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" 
event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"2c2cf707f5f9a1be883e36586cd22054bcb9108b9cceeb628b966f05104aecb4"} Mar 12 14:59:16 crc kubenswrapper[4869]: I0312 14:59:16.311813 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"a4353c1fd20cdf5a8e0f7d0d89ff19c4b6781105e192f55274039104abf2a858"} Mar 12 14:59:16 crc kubenswrapper[4869]: I0312 14:59:16.311821 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"da0f8048ec6e3af994fd62fadf36af397e8a1fe69e45d3cc281551d93fc2d014"} Mar 12 14:59:18 crc kubenswrapper[4869]: I0312 14:59:18.330974 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"93babac243279ec1c0e6eea681a810439bf8e39524c9eef07546d4570f52552e"} Mar 12 14:59:19 crc kubenswrapper[4869]: I0312 14:59:19.684028 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:59:19 crc kubenswrapper[4869]: I0312 14:59:19.684303 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:59:19 crc kubenswrapper[4869]: I0312 14:59:19.684340 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 14:59:20 crc kubenswrapper[4869]: I0312 14:59:20.341137 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"782a4d0854bd913b58ff7c98e0d483c61a66e1f21a0fa88c58c100bef1826876"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:59:20 crc kubenswrapper[4869]: I0312 14:59:20.341219 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://782a4d0854bd913b58ff7c98e0d483c61a66e1f21a0fa88c58c100bef1826876" gracePeriod=600 Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.348660 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" event={"ID":"b5e55f41-e780-4667-8435-596089e03974","Type":"ContainerStarted","Data":"f9ea830b70fe9e3603f19d80406452c596affee3bc7f40115192047edf011a0a"} Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.349003 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.349015 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.351618 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="782a4d0854bd913b58ff7c98e0d483c61a66e1f21a0fa88c58c100bef1826876" exitCode=0 Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.351665 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"782a4d0854bd913b58ff7c98e0d483c61a66e1f21a0fa88c58c100bef1826876"} Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.351704 4869 scope.go:117] "RemoveContainer" containerID="7938ffdcb0f5d68aef181bbbe274d4e45cbcba7c963f76ee6c1f153d4d2ccdd0" Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.377574 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:21 crc kubenswrapper[4869]: I0312 14:59:21.380746 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" podStartSLOduration=8.380728353 podStartE2EDuration="8.380728353s" podCreationTimestamp="2026-03-12 14:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:59:21.377615125 +0000 UTC m=+713.662840413" watchObservedRunningTime="2026-03-12 14:59:21.380728353 +0000 UTC m=+713.665953631" Mar 12 14:59:22 crc kubenswrapper[4869]: I0312 14:59:22.359904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"f13d6e2c562cf7652d696fbd35b8d5dcd20c099639553782e920f330cc3ff75c"} Mar 12 14:59:22 crc kubenswrapper[4869]: I0312 14:59:22.362573 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:22 crc kubenswrapper[4869]: I0312 14:59:22.428941 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:30 crc kubenswrapper[4869]: I0312 14:59:30.255436 4869 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 14:59:34 crc kubenswrapper[4869]: I0312 14:59:34.907194 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Mar 12 14:59:34 crc kubenswrapper[4869]: I0312 14:59:34.908524 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Mar 12 14:59:34 crc kubenswrapper[4869]: I0312 14:59:34.911078 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 12 14:59:34 crc kubenswrapper[4869]: I0312 14:59:34.911225 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 12 14:59:34 crc kubenswrapper[4869]: I0312 14:59:34.911788 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fskfm" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.075941 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-log\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.075999 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg46x\" (UniqueName: \"kubernetes.io/projected/84a3366d-25a7-430b-839c-f7f21cbac99a-kube-api-access-kg46x\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.076039 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-data\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.076659 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-run\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.178284 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg46x\" (UniqueName: \"kubernetes.io/projected/84a3366d-25a7-430b-839c-f7f21cbac99a-kube-api-access-kg46x\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.178393 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-data\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.178435 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-run\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.178609 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-log\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.178936 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-data\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.179154 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-log\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.179491 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/84a3366d-25a7-430b-839c-f7f21cbac99a-run\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.202100 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg46x\" (UniqueName: \"kubernetes.io/projected/84a3366d-25a7-430b-839c-f7f21cbac99a-kube-api-access-kg46x\") pod \"ceph\" (UID: \"84a3366d-25a7-430b-839c-f7f21cbac99a\") " pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.229608 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Mar 12 14:59:35 crc kubenswrapper[4869]: W0312 14:59:35.256777 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84a3366d_25a7_430b_839c_f7f21cbac99a.slice/crio-29702e2c20b82b8e2ad9b611a5040f555168d2dac2f13d2d212f98a18e2f7aa7 WatchSource:0}: Error finding container 29702e2c20b82b8e2ad9b611a5040f555168d2dac2f13d2d212f98a18e2f7aa7: Status 404 returned error can't find the container with id 29702e2c20b82b8e2ad9b611a5040f555168d2dac2f13d2d212f98a18e2f7aa7 Mar 12 14:59:35 crc kubenswrapper[4869]: I0312 14:59:35.436128 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"84a3366d-25a7-430b-839c-f7f21cbac99a","Type":"ContainerStarted","Data":"29702e2c20b82b8e2ad9b611a5040f555168d2dac2f13d2d212f98a18e2f7aa7"} Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.303053 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8zsn" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.335182 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-trwvm"] Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.337208 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.356718 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trwvm"] Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.403413 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-catalog-content\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.403563 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-utilities\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.403582 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phrms\" (UniqueName: \"kubernetes.io/projected/c69f1078-e5b7-45f0-a193-d9537762ea14-kube-api-access-phrms\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.504259 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-utilities\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.504304 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phrms\" (UniqueName: \"kubernetes.io/projected/c69f1078-e5b7-45f0-a193-d9537762ea14-kube-api-access-phrms\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.504344 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-catalog-content\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.504810 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-utilities\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.504822 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-catalog-content\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.524433 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phrms\" (UniqueName: 
\"kubernetes.io/projected/c69f1078-e5b7-45f0-a193-d9537762ea14-kube-api-access-phrms\") pod \"redhat-marketplace-trwvm\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:44 crc kubenswrapper[4869]: I0312 14:59:44.681239 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:50 crc kubenswrapper[4869]: I0312 14:59:50.793372 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trwvm"] Mar 12 14:59:51 crc kubenswrapper[4869]: I0312 14:59:51.521473 4869 generic.go:334] "Generic (PLEG): container finished" podID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerID="8763c469ce6fee5bcadd57cdf9c119c1b531e8e4a3f269cfdc820ed4e303ede5" exitCode=0 Mar 12 14:59:51 crc kubenswrapper[4869]: I0312 14:59:51.521610 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trwvm" event={"ID":"c69f1078-e5b7-45f0-a193-d9537762ea14","Type":"ContainerDied","Data":"8763c469ce6fee5bcadd57cdf9c119c1b531e8e4a3f269cfdc820ed4e303ede5"} Mar 12 14:59:51 crc kubenswrapper[4869]: I0312 14:59:51.522314 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trwvm" event={"ID":"c69f1078-e5b7-45f0-a193-d9537762ea14","Type":"ContainerStarted","Data":"b9cd9ae40bc860a90bcbdcb5744535dbdd2a5e6678a3653bba61022730fecea1"} Mar 12 14:59:51 crc kubenswrapper[4869]: I0312 14:59:51.528517 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"84a3366d-25a7-430b-839c-f7f21cbac99a","Type":"ContainerStarted","Data":"2302bf3721c709311b8d5f7dca2715b9f87ab481ab12af2d5a4d6912b661586b"} Mar 12 14:59:51 crc kubenswrapper[4869]: I0312 14:59:51.561161 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=2.234511905 
podStartE2EDuration="17.561130389s" podCreationTimestamp="2026-03-12 14:59:34 +0000 UTC" firstStartedPulling="2026-03-12 14:59:35.259107453 +0000 UTC m=+727.544332731" lastFinishedPulling="2026-03-12 14:59:50.585725927 +0000 UTC m=+742.870951215" observedRunningTime="2026-03-12 14:59:51.553836543 +0000 UTC m=+743.839061871" watchObservedRunningTime="2026-03-12 14:59:51.561130389 +0000 UTC m=+743.846355707" Mar 12 14:59:52 crc kubenswrapper[4869]: I0312 14:59:52.536446 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trwvm" event={"ID":"c69f1078-e5b7-45f0-a193-d9537762ea14","Type":"ContainerStarted","Data":"3c3a02490f688019697f2cff47eb4fc5a599706a0cd09a3f5e7d77c06e346f3d"} Mar 12 14:59:53 crc kubenswrapper[4869]: I0312 14:59:53.542954 4869 generic.go:334] "Generic (PLEG): container finished" podID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerID="3c3a02490f688019697f2cff47eb4fc5a599706a0cd09a3f5e7d77c06e346f3d" exitCode=0 Mar 12 14:59:53 crc kubenswrapper[4869]: I0312 14:59:53.543318 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trwvm" event={"ID":"c69f1078-e5b7-45f0-a193-d9537762ea14","Type":"ContainerDied","Data":"3c3a02490f688019697f2cff47eb4fc5a599706a0cd09a3f5e7d77c06e346f3d"} Mar 12 14:59:54 crc kubenswrapper[4869]: I0312 14:59:54.549153 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trwvm" event={"ID":"c69f1078-e5b7-45f0-a193-d9537762ea14","Type":"ContainerStarted","Data":"aaebaf6370ac43a2e34efce62594d3947fe9d433f17c244192a49ff621404948"} Mar 12 14:59:54 crc kubenswrapper[4869]: I0312 14:59:54.682327 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:54 crc kubenswrapper[4869]: I0312 14:59:54.682672 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 14:59:55 crc kubenswrapper[4869]: E0312 14:59:55.695768 4869 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.223:59422->38.102.83.223:34669: write tcp 38.102.83.223:59422->38.102.83.223:34669: write: broken pipe Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.727016 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-trwvm" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="registry-server" probeResult="failure" output=< Mar 12 14:59:55 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 14:59:55 crc kubenswrapper[4869]: > Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.746580 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-trwvm" podStartSLOduration=9.321298443 podStartE2EDuration="11.746557874s" podCreationTimestamp="2026-03-12 14:59:44 +0000 UTC" firstStartedPulling="2026-03-12 14:59:51.528694944 +0000 UTC m=+743.813920232" lastFinishedPulling="2026-03-12 14:59:53.953954375 +0000 UTC m=+746.239179663" observedRunningTime="2026-03-12 14:59:54.571636613 +0000 UTC m=+746.856861891" watchObservedRunningTime="2026-03-12 14:59:55.746557874 +0000 UTC m=+748.031783162" Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.751229 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q5lql"] Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.752470 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.768617 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5lql"] Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.902652 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdl7\" (UniqueName: \"kubernetes.io/projected/af93b315-e2d0-4075-a423-944feada01ec-kube-api-access-rgdl7\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.902820 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-utilities\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:55 crc kubenswrapper[4869]: I0312 14:59:55.903003 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-catalog-content\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.003917 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-catalog-content\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.004013 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rgdl7\" (UniqueName: \"kubernetes.io/projected/af93b315-e2d0-4075-a423-944feada01ec-kube-api-access-rgdl7\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.004041 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-utilities\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.004583 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-catalog-content\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.004634 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-utilities\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.024907 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdl7\" (UniqueName: \"kubernetes.io/projected/af93b315-e2d0-4075-a423-944feada01ec-kube-api-access-rgdl7\") pod \"certified-operators-q5lql\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.067673 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.282392 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5lql"] Mar 12 14:59:56 crc kubenswrapper[4869]: W0312 14:59:56.293901 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf93b315_e2d0_4075_a423_944feada01ec.slice/crio-71dbded7bcd75fd80cb355db9a9c36b88d09698678c823492217dd7f9c4ca317 WatchSource:0}: Error finding container 71dbded7bcd75fd80cb355db9a9c36b88d09698678c823492217dd7f9c4ca317: Status 404 returned error can't find the container with id 71dbded7bcd75fd80cb355db9a9c36b88d09698678c823492217dd7f9c4ca317 Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.561528 4869 generic.go:334] "Generic (PLEG): container finished" podID="af93b315-e2d0-4075-a423-944feada01ec" containerID="0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10" exitCode=0 Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.562689 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5lql" event={"ID":"af93b315-e2d0-4075-a423-944feada01ec","Type":"ContainerDied","Data":"0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10"} Mar 12 14:59:56 crc kubenswrapper[4869]: I0312 14:59:56.562752 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5lql" event={"ID":"af93b315-e2d0-4075-a423-944feada01ec","Type":"ContainerStarted","Data":"71dbded7bcd75fd80cb355db9a9c36b88d09698678c823492217dd7f9c4ca317"} Mar 12 14:59:58 crc kubenswrapper[4869]: I0312 14:59:58.576236 4869 generic.go:334] "Generic (PLEG): container finished" podID="af93b315-e2d0-4075-a423-944feada01ec" containerID="58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92" exitCode=0 Mar 12 14:59:58 crc kubenswrapper[4869]: I0312 
14:59:58.576458 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5lql" event={"ID":"af93b315-e2d0-4075-a423-944feada01ec","Type":"ContainerDied","Data":"58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92"} Mar 12 14:59:59 crc kubenswrapper[4869]: I0312 14:59:59.583855 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5lql" event={"ID":"af93b315-e2d0-4075-a423-944feada01ec","Type":"ContainerStarted","Data":"290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a"} Mar 12 14:59:59 crc kubenswrapper[4869]: I0312 14:59:59.599565 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q5lql" podStartSLOduration=2.167171679 podStartE2EDuration="4.59955007s" podCreationTimestamp="2026-03-12 14:59:55 +0000 UTC" firstStartedPulling="2026-03-12 14:59:56.56365559 +0000 UTC m=+748.848880868" lastFinishedPulling="2026-03-12 14:59:58.996033961 +0000 UTC m=+751.281259259" observedRunningTime="2026-03-12 14:59:59.598416398 +0000 UTC m=+751.883641706" watchObservedRunningTime="2026-03-12 14:59:59.59955007 +0000 UTC m=+751.884775348" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.134967 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k"] Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.135843 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.137244 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.137303 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.139352 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555460-9kdz9"] Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.140456 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-9kdz9" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.142241 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.143279 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.144287 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k"] Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.145404 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.149190 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-9kdz9"] Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.162964 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnz8b\" (UniqueName: 
\"kubernetes.io/projected/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-kube-api-access-vnz8b\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.163045 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-secret-volume\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.163109 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tq7p\" (UniqueName: \"kubernetes.io/projected/a6888aca-055f-4d64-a0a6-e42ea6011fef-kube-api-access-6tq7p\") pod \"auto-csr-approver-29555460-9kdz9\" (UID: \"a6888aca-055f-4d64-a0a6-e42ea6011fef\") " pod="openshift-infra/auto-csr-approver-29555460-9kdz9" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.163137 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-config-volume\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.264159 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tq7p\" (UniqueName: \"kubernetes.io/projected/a6888aca-055f-4d64-a0a6-e42ea6011fef-kube-api-access-6tq7p\") pod \"auto-csr-approver-29555460-9kdz9\" (UID: \"a6888aca-055f-4d64-a0a6-e42ea6011fef\") " pod="openshift-infra/auto-csr-approver-29555460-9kdz9" Mar 12 15:00:00 
crc kubenswrapper[4869]: I0312 15:00:00.264222 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-config-volume\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.264259 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnz8b\" (UniqueName: \"kubernetes.io/projected/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-kube-api-access-vnz8b\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.264295 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-secret-volume\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.265497 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-config-volume\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.269964 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-secret-volume\") pod \"collect-profiles-29555460-f925k\" (UID: 
\"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.279575 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tq7p\" (UniqueName: \"kubernetes.io/projected/a6888aca-055f-4d64-a0a6-e42ea6011fef-kube-api-access-6tq7p\") pod \"auto-csr-approver-29555460-9kdz9\" (UID: \"a6888aca-055f-4d64-a0a6-e42ea6011fef\") " pod="openshift-infra/auto-csr-approver-29555460-9kdz9" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.308261 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnz8b\" (UniqueName: \"kubernetes.io/projected/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-kube-api-access-vnz8b\") pod \"collect-profiles-29555460-f925k\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.451347 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.473838 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-9kdz9" Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.851855 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k"] Mar 12 15:00:00 crc kubenswrapper[4869]: W0312 15:00:00.858205 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6119e4f3_9261_46ad_a270_8ce6a7b7bba4.slice/crio-35ee87a0b554a0b78a5d898dd1ee798bff0c610a3026a8ab9533e3e4d264ea8d WatchSource:0}: Error finding container 35ee87a0b554a0b78a5d898dd1ee798bff0c610a3026a8ab9533e3e4d264ea8d: Status 404 returned error can't find the container with id 35ee87a0b554a0b78a5d898dd1ee798bff0c610a3026a8ab9533e3e4d264ea8d Mar 12 15:00:00 crc kubenswrapper[4869]: I0312 15:00:00.921990 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-9kdz9"] Mar 12 15:00:00 crc kubenswrapper[4869]: W0312 15:00:00.934401 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6888aca_055f_4d64_a0a6_e42ea6011fef.slice/crio-a4a015603a7fa7572599c8e34713d4fabd72eae11c422bdc84b7be1b87d7da0d WatchSource:0}: Error finding container a4a015603a7fa7572599c8e34713d4fabd72eae11c422bdc84b7be1b87d7da0d: Status 404 returned error can't find the container with id a4a015603a7fa7572599c8e34713d4fabd72eae11c422bdc84b7be1b87d7da0d Mar 12 15:00:01 crc kubenswrapper[4869]: I0312 15:00:01.600153 4869 generic.go:334] "Generic (PLEG): container finished" podID="6119e4f3-9261-46ad-a270-8ce6a7b7bba4" containerID="58ecb8bf03ff4dcc85710784c1aa389c9aeffc9e295fef9f20523709c63a9d10" exitCode=0 Mar 12 15:00:01 crc kubenswrapper[4869]: I0312 15:00:01.600280 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" 
event={"ID":"6119e4f3-9261-46ad-a270-8ce6a7b7bba4","Type":"ContainerDied","Data":"58ecb8bf03ff4dcc85710784c1aa389c9aeffc9e295fef9f20523709c63a9d10"} Mar 12 15:00:01 crc kubenswrapper[4869]: I0312 15:00:01.600788 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" event={"ID":"6119e4f3-9261-46ad-a270-8ce6a7b7bba4","Type":"ContainerStarted","Data":"35ee87a0b554a0b78a5d898dd1ee798bff0c610a3026a8ab9533e3e4d264ea8d"} Mar 12 15:00:01 crc kubenswrapper[4869]: I0312 15:00:01.602208 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555460-9kdz9" event={"ID":"a6888aca-055f-4d64-a0a6-e42ea6011fef","Type":"ContainerStarted","Data":"a4a015603a7fa7572599c8e34713d4fabd72eae11c422bdc84b7be1b87d7da0d"} Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.834587 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.897753 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-config-volume\") pod \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.897831 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-secret-volume\") pod \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.897873 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnz8b\" (UniqueName: 
\"kubernetes.io/projected/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-kube-api-access-vnz8b\") pod \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\" (UID: \"6119e4f3-9261-46ad-a270-8ce6a7b7bba4\") " Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.899076 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-config-volume" (OuterVolumeSpecName: "config-volume") pod "6119e4f3-9261-46ad-a270-8ce6a7b7bba4" (UID: "6119e4f3-9261-46ad-a270-8ce6a7b7bba4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.904973 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6119e4f3-9261-46ad-a270-8ce6a7b7bba4" (UID: "6119e4f3-9261-46ad-a270-8ce6a7b7bba4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.905361 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-kube-api-access-vnz8b" (OuterVolumeSpecName: "kube-api-access-vnz8b") pod "6119e4f3-9261-46ad-a270-8ce6a7b7bba4" (UID: "6119e4f3-9261-46ad-a270-8ce6a7b7bba4"). InnerVolumeSpecName "kube-api-access-vnz8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.999079 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnz8b\" (UniqueName: \"kubernetes.io/projected/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-kube-api-access-vnz8b\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.999128 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:02 crc kubenswrapper[4869]: I0312 15:00:02.999147 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6119e4f3-9261-46ad-a270-8ce6a7b7bba4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:03 crc kubenswrapper[4869]: I0312 15:00:03.619711 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" event={"ID":"6119e4f3-9261-46ad-a270-8ce6a7b7bba4","Type":"ContainerDied","Data":"35ee87a0b554a0b78a5d898dd1ee798bff0c610a3026a8ab9533e3e4d264ea8d"} Mar 12 15:00:03 crc kubenswrapper[4869]: I0312 15:00:03.619763 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35ee87a0b554a0b78a5d898dd1ee798bff0c610a3026a8ab9533e3e4d264ea8d" Mar 12 15:00:03 crc kubenswrapper[4869]: I0312 15:00:03.619914 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k" Mar 12 15:00:04 crc kubenswrapper[4869]: I0312 15:00:04.724333 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 15:00:04 crc kubenswrapper[4869]: I0312 15:00:04.766601 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 15:00:04 crc kubenswrapper[4869]: I0312 15:00:04.960308 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trwvm"] Mar 12 15:00:06 crc kubenswrapper[4869]: I0312 15:00:06.068659 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 15:00:06 crc kubenswrapper[4869]: I0312 15:00:06.068730 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 15:00:06 crc kubenswrapper[4869]: I0312 15:00:06.105579 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 15:00:06 crc kubenswrapper[4869]: I0312 15:00:06.636502 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-trwvm" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="registry-server" containerID="cri-o://aaebaf6370ac43a2e34efce62594d3947fe9d433f17c244192a49ff621404948" gracePeriod=2 Mar 12 15:00:06 crc kubenswrapper[4869]: I0312 15:00:06.675476 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.361489 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q5lql"] Mar 12 15:00:07 crc kubenswrapper[4869]: 
I0312 15:00:07.653609 4869 generic.go:334] "Generic (PLEG): container finished" podID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerID="aaebaf6370ac43a2e34efce62594d3947fe9d433f17c244192a49ff621404948" exitCode=0 Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.653660 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trwvm" event={"ID":"c69f1078-e5b7-45f0-a193-d9537762ea14","Type":"ContainerDied","Data":"aaebaf6370ac43a2e34efce62594d3947fe9d433f17c244192a49ff621404948"} Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.795513 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.881363 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phrms\" (UniqueName: \"kubernetes.io/projected/c69f1078-e5b7-45f0-a193-d9537762ea14-kube-api-access-phrms\") pod \"c69f1078-e5b7-45f0-a193-d9537762ea14\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.881706 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-utilities\") pod \"c69f1078-e5b7-45f0-a193-d9537762ea14\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.882136 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-catalog-content\") pod \"c69f1078-e5b7-45f0-a193-d9537762ea14\" (UID: \"c69f1078-e5b7-45f0-a193-d9537762ea14\") " Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.883072 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-utilities" (OuterVolumeSpecName: "utilities") pod "c69f1078-e5b7-45f0-a193-d9537762ea14" (UID: "c69f1078-e5b7-45f0-a193-d9537762ea14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.890196 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69f1078-e5b7-45f0-a193-d9537762ea14-kube-api-access-phrms" (OuterVolumeSpecName: "kube-api-access-phrms") pod "c69f1078-e5b7-45f0-a193-d9537762ea14" (UID: "c69f1078-e5b7-45f0-a193-d9537762ea14"). InnerVolumeSpecName "kube-api-access-phrms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.910864 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c69f1078-e5b7-45f0-a193-d9537762ea14" (UID: "c69f1078-e5b7-45f0-a193-d9537762ea14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.984686 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.984748 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phrms\" (UniqueName: \"kubernetes.io/projected/c69f1078-e5b7-45f0-a193-d9537762ea14-kube-api-access-phrms\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:07 crc kubenswrapper[4869]: I0312 15:00:07.984766 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69f1078-e5b7-45f0-a193-d9537762ea14-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.662095 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trwvm" event={"ID":"c69f1078-e5b7-45f0-a193-d9537762ea14","Type":"ContainerDied","Data":"b9cd9ae40bc860a90bcbdcb5744535dbdd2a5e6678a3653bba61022730fecea1"} Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.662475 4869 scope.go:117] "RemoveContainer" containerID="aaebaf6370ac43a2e34efce62594d3947fe9d433f17c244192a49ff621404948" Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.662315 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trwvm" Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.665446 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6888aca-055f-4d64-a0a6-e42ea6011fef" containerID="897337953be2efb379efb1dd04745ec6b90bdbb52ca0003d268f70b0aae27816" exitCode=0 Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.665678 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555460-9kdz9" event={"ID":"a6888aca-055f-4d64-a0a6-e42ea6011fef","Type":"ContainerDied","Data":"897337953be2efb379efb1dd04745ec6b90bdbb52ca0003d268f70b0aae27816"} Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.665836 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q5lql" podUID="af93b315-e2d0-4075-a423-944feada01ec" containerName="registry-server" containerID="cri-o://290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a" gracePeriod=2 Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.678532 4869 scope.go:117] "RemoveContainer" containerID="3c3a02490f688019697f2cff47eb4fc5a599706a0cd09a3f5e7d77c06e346f3d" Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.703794 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trwvm"] Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.711431 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-trwvm"] Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.711677 4869 scope.go:117] "RemoveContainer" containerID="8763c469ce6fee5bcadd57cdf9c119c1b531e8e4a3f269cfdc820ed4e303ede5" Mar 12 15:00:08 crc kubenswrapper[4869]: I0312 15:00:08.999505 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.097637 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgdl7\" (UniqueName: \"kubernetes.io/projected/af93b315-e2d0-4075-a423-944feada01ec-kube-api-access-rgdl7\") pod \"af93b315-e2d0-4075-a423-944feada01ec\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.097728 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-catalog-content\") pod \"af93b315-e2d0-4075-a423-944feada01ec\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.097806 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-utilities\") pod \"af93b315-e2d0-4075-a423-944feada01ec\" (UID: \"af93b315-e2d0-4075-a423-944feada01ec\") " Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.099421 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-utilities" (OuterVolumeSpecName: "utilities") pod "af93b315-e2d0-4075-a423-944feada01ec" (UID: "af93b315-e2d0-4075-a423-944feada01ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.105420 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af93b315-e2d0-4075-a423-944feada01ec-kube-api-access-rgdl7" (OuterVolumeSpecName: "kube-api-access-rgdl7") pod "af93b315-e2d0-4075-a423-944feada01ec" (UID: "af93b315-e2d0-4075-a423-944feada01ec"). InnerVolumeSpecName "kube-api-access-rgdl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.178344 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af93b315-e2d0-4075-a423-944feada01ec" (UID: "af93b315-e2d0-4075-a423-944feada01ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.198657 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.198685 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgdl7\" (UniqueName: \"kubernetes.io/projected/af93b315-e2d0-4075-a423-944feada01ec-kube-api-access-rgdl7\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.198696 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af93b315-e2d0-4075-a423-944feada01ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.673394 4869 generic.go:334] "Generic (PLEG): container finished" podID="af93b315-e2d0-4075-a423-944feada01ec" containerID="290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a" exitCode=0 Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.673479 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5lql" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.673488 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5lql" event={"ID":"af93b315-e2d0-4075-a423-944feada01ec","Type":"ContainerDied","Data":"290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a"} Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.674447 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5lql" event={"ID":"af93b315-e2d0-4075-a423-944feada01ec","Type":"ContainerDied","Data":"71dbded7bcd75fd80cb355db9a9c36b88d09698678c823492217dd7f9c4ca317"} Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.674497 4869 scope.go:117] "RemoveContainer" containerID="290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.688791 4869 scope.go:117] "RemoveContainer" containerID="58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.716220 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q5lql"] Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.721712 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q5lql"] Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.733146 4869 scope.go:117] "RemoveContainer" containerID="0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.746684 4869 scope.go:117] "RemoveContainer" containerID="290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a" Mar 12 15:00:09 crc kubenswrapper[4869]: E0312 15:00:09.747374 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a\": container with ID starting with 290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a not found: ID does not exist" containerID="290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.747413 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a"} err="failed to get container status \"290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a\": rpc error: code = NotFound desc = could not find container \"290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a\": container with ID starting with 290c87c4a87501b48fa21f152ad6caff8bebaf9312e19f4e276e5f259011759a not found: ID does not exist" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.747438 4869 scope.go:117] "RemoveContainer" containerID="58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92" Mar 12 15:00:09 crc kubenswrapper[4869]: E0312 15:00:09.747817 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92\": container with ID starting with 58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92 not found: ID does not exist" containerID="58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.747854 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92"} err="failed to get container status \"58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92\": rpc error: code = NotFound desc = could not find container \"58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92\": container with ID 
starting with 58e54c879ae3e57f12e1933f3b1495daba9de10031454ff4abe3d4da77841a92 not found: ID does not exist" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.747883 4869 scope.go:117] "RemoveContainer" containerID="0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10" Mar 12 15:00:09 crc kubenswrapper[4869]: E0312 15:00:09.748139 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10\": container with ID starting with 0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10 not found: ID does not exist" containerID="0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.748160 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10"} err="failed to get container status \"0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10\": rpc error: code = NotFound desc = could not find container \"0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10\": container with ID starting with 0848af52a63d922288031fa0c54e5b9053d2425cb253f027df19afc19907cb10 not found: ID does not exist" Mar 12 15:00:09 crc kubenswrapper[4869]: I0312 15:00:09.913871 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-9kdz9" Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.013943 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tq7p\" (UniqueName: \"kubernetes.io/projected/a6888aca-055f-4d64-a0a6-e42ea6011fef-kube-api-access-6tq7p\") pod \"a6888aca-055f-4d64-a0a6-e42ea6011fef\" (UID: \"a6888aca-055f-4d64-a0a6-e42ea6011fef\") " Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.019107 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6888aca-055f-4d64-a0a6-e42ea6011fef-kube-api-access-6tq7p" (OuterVolumeSpecName: "kube-api-access-6tq7p") pod "a6888aca-055f-4d64-a0a6-e42ea6011fef" (UID: "a6888aca-055f-4d64-a0a6-e42ea6011fef"). InnerVolumeSpecName "kube-api-access-6tq7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.115018 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tq7p\" (UniqueName: \"kubernetes.io/projected/a6888aca-055f-4d64-a0a6-e42ea6011fef-kube-api-access-6tq7p\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.343656 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af93b315-e2d0-4075-a423-944feada01ec" path="/var/lib/kubelet/pods/af93b315-e2d0-4075-a423-944feada01ec/volumes" Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.344745 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" path="/var/lib/kubelet/pods/c69f1078-e5b7-45f0-a193-d9537762ea14/volumes" Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.684934 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555460-9kdz9" 
event={"ID":"a6888aca-055f-4d64-a0a6-e42ea6011fef","Type":"ContainerDied","Data":"a4a015603a7fa7572599c8e34713d4fabd72eae11c422bdc84b7be1b87d7da0d"} Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.684980 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4a015603a7fa7572599c8e34713d4fabd72eae11c422bdc84b7be1b87d7da0d" Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.685025 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-9kdz9" Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.989176 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-5svs7"] Mar 12 15:00:10 crc kubenswrapper[4869]: I0312 15:00:10.994104 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-5svs7"] Mar 12 15:00:12 crc kubenswrapper[4869]: I0312 15:00:12.348186 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a92e51-8487-471e-a055-e7f63f101490" path="/var/lib/kubelet/pods/f1a92e51-8487-471e-a055-e7f63f101490/volumes" Mar 12 15:00:29 crc kubenswrapper[4869]: I0312 15:00:29.151349 4869 scope.go:117] "RemoveContainer" containerID="962c67795a5a57351d812d8e4dfa75849a6944336d685c83fe2d3167f79c78d9" Mar 12 15:00:30 crc kubenswrapper[4869]: E0312 15:00:30.551047 4869 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.223:51104->38.102.83.223:34669: write tcp 38.102.83.223:51104->38.102.83.223:34669: write: connection reset by peer Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.126534 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw"] Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127248 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af93b315-e2d0-4075-a423-944feada01ec" 
containerName="extract-content" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127259 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="af93b315-e2d0-4075-a423-944feada01ec" containerName="extract-content" Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127270 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af93b315-e2d0-4075-a423-944feada01ec" containerName="registry-server" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127276 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="af93b315-e2d0-4075-a423-944feada01ec" containerName="registry-server" Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127284 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6888aca-055f-4d64-a0a6-e42ea6011fef" containerName="oc" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127290 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6888aca-055f-4d64-a0a6-e42ea6011fef" containerName="oc" Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127299 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="registry-server" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127305 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="registry-server" Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127313 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="extract-utilities" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127318 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="extract-utilities" Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127326 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af93b315-e2d0-4075-a423-944feada01ec" containerName="extract-utilities" 
Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127332 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="af93b315-e2d0-4075-a423-944feada01ec" containerName="extract-utilities" Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127339 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="extract-content" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127345 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="extract-content" Mar 12 15:01:06 crc kubenswrapper[4869]: E0312 15:01:06.127351 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6119e4f3-9261-46ad-a270-8ce6a7b7bba4" containerName="collect-profiles" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127356 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6119e4f3-9261-46ad-a270-8ce6a7b7bba4" containerName="collect-profiles" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127435 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69f1078-e5b7-45f0-a193-d9537762ea14" containerName="registry-server" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127445 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="af93b315-e2d0-4075-a423-944feada01ec" containerName="registry-server" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127457 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6888aca-055f-4d64-a0a6-e42ea6011fef" containerName="oc" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.127467 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6119e4f3-9261-46ad-a270-8ce6a7b7bba4" containerName="collect-profiles" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.128163 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.135129 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.135814 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw"] Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.233795 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.233855 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.233952 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thzdz\" (UniqueName: \"kubernetes.io/projected/15ef5f34-7088-4444-8423-ace6aaa9661a-kube-api-access-thzdz\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: 
I0312 15:01:06.334930 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thzdz\" (UniqueName: \"kubernetes.io/projected/15ef5f34-7088-4444-8423-ace6aaa9661a-kube-api-access-thzdz\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.334995 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.335025 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.335514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.336457 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.353485 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thzdz\" (UniqueName: \"kubernetes.io/projected/15ef5f34-7088-4444-8423-ace6aaa9661a-kube-api-access-thzdz\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.442170 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:06 crc kubenswrapper[4869]: I0312 15:01:06.629449 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw"] Mar 12 15:01:07 crc kubenswrapper[4869]: I0312 15:01:07.056498 4869 generic.go:334] "Generic (PLEG): container finished" podID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerID="02122ee1c94db50769e31c3044d5494d39a3a4f7b359d7baef79e69607960b18" exitCode=0 Mar 12 15:01:07 crc kubenswrapper[4869]: I0312 15:01:07.056574 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" event={"ID":"15ef5f34-7088-4444-8423-ace6aaa9661a","Type":"ContainerDied","Data":"02122ee1c94db50769e31c3044d5494d39a3a4f7b359d7baef79e69607960b18"} Mar 12 15:01:07 crc kubenswrapper[4869]: I0312 15:01:07.056634 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" event={"ID":"15ef5f34-7088-4444-8423-ace6aaa9661a","Type":"ContainerStarted","Data":"7782b6ea333553eb0b1bf39e1d1ba19e645b7dd03ad2185c89e4bb03e7e8a7ad"} Mar 12 15:01:07 crc kubenswrapper[4869]: I0312 15:01:07.058138 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.492422 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v6qsw"] Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.493811 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.512432 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v6qsw"] Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.564145 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-utilities\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.564349 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvcqs\" (UniqueName: \"kubernetes.io/projected/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-kube-api-access-cvcqs\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.564487 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-catalog-content\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.665916 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvcqs\" (UniqueName: \"kubernetes.io/projected/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-kube-api-access-cvcqs\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.666001 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-catalog-content\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.666035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-utilities\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.666451 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-utilities\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.666516 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-catalog-content\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.684131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvcqs\" (UniqueName: \"kubernetes.io/projected/c1f177f0-abce-4ed6-9aee-0d11fd6818ef-kube-api-access-cvcqs\") pod \"redhat-operators-v6qsw\" (UID: \"c1f177f0-abce-4ed6-9aee-0d11fd6818ef\") " pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:08 crc kubenswrapper[4869]: I0312 15:01:08.821290 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:09 crc kubenswrapper[4869]: I0312 15:01:09.008562 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v6qsw"] Mar 12 15:01:09 crc kubenswrapper[4869]: W0312 15:01:09.017154 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f177f0_abce_4ed6_9aee_0d11fd6818ef.slice/crio-eb5fd5215b95a5dfc61432c00571bf848cd672aad473627e8aac7731ba301edc WatchSource:0}: Error finding container eb5fd5215b95a5dfc61432c00571bf848cd672aad473627e8aac7731ba301edc: Status 404 returned error can't find the container with id eb5fd5215b95a5dfc61432c00571bf848cd672aad473627e8aac7731ba301edc Mar 12 15:01:09 crc kubenswrapper[4869]: I0312 15:01:09.075855 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6qsw" event={"ID":"c1f177f0-abce-4ed6-9aee-0d11fd6818ef","Type":"ContainerStarted","Data":"eb5fd5215b95a5dfc61432c00571bf848cd672aad473627e8aac7731ba301edc"} Mar 12 15:01:10 crc kubenswrapper[4869]: I0312 15:01:10.084150 4869 generic.go:334] "Generic (PLEG): container finished" podID="15ef5f34-7088-4444-8423-ace6aaa9661a" 
containerID="c5cffcb49e3991e91834e00d949f8cf6f16838c2ba02d08d1de9e0ab9227a73a" exitCode=0 Mar 12 15:01:10 crc kubenswrapper[4869]: I0312 15:01:10.084244 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" event={"ID":"15ef5f34-7088-4444-8423-ace6aaa9661a","Type":"ContainerDied","Data":"c5cffcb49e3991e91834e00d949f8cf6f16838c2ba02d08d1de9e0ab9227a73a"} Mar 12 15:01:10 crc kubenswrapper[4869]: I0312 15:01:10.088676 4869 generic.go:334] "Generic (PLEG): container finished" podID="c1f177f0-abce-4ed6-9aee-0d11fd6818ef" containerID="fb91372b97f64a359c9f97a8fb0b34e57e64ecb7d5a5809af143f9850906dfe2" exitCode=0 Mar 12 15:01:10 crc kubenswrapper[4869]: I0312 15:01:10.088720 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6qsw" event={"ID":"c1f177f0-abce-4ed6-9aee-0d11fd6818ef","Type":"ContainerDied","Data":"fb91372b97f64a359c9f97a8fb0b34e57e64ecb7d5a5809af143f9850906dfe2"} Mar 12 15:01:11 crc kubenswrapper[4869]: I0312 15:01:11.095969 4869 generic.go:334] "Generic (PLEG): container finished" podID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerID="7750dac0229ad6b5daf24cc0c8ca749b39dd482e182688f54fd99015992a09bb" exitCode=0 Mar 12 15:01:11 crc kubenswrapper[4869]: I0312 15:01:11.096044 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" event={"ID":"15ef5f34-7088-4444-8423-ace6aaa9661a","Type":"ContainerDied","Data":"7750dac0229ad6b5daf24cc0c8ca749b39dd482e182688f54fd99015992a09bb"} Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.320999 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.406385 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-util\") pod \"15ef5f34-7088-4444-8423-ace6aaa9661a\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.406475 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-bundle\") pod \"15ef5f34-7088-4444-8423-ace6aaa9661a\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.406576 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thzdz\" (UniqueName: \"kubernetes.io/projected/15ef5f34-7088-4444-8423-ace6aaa9661a-kube-api-access-thzdz\") pod \"15ef5f34-7088-4444-8423-ace6aaa9661a\" (UID: \"15ef5f34-7088-4444-8423-ace6aaa9661a\") " Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.407342 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-bundle" (OuterVolumeSpecName: "bundle") pod "15ef5f34-7088-4444-8423-ace6aaa9661a" (UID: "15ef5f34-7088-4444-8423-ace6aaa9661a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.411829 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ef5f34-7088-4444-8423-ace6aaa9661a-kube-api-access-thzdz" (OuterVolumeSpecName: "kube-api-access-thzdz") pod "15ef5f34-7088-4444-8423-ace6aaa9661a" (UID: "15ef5f34-7088-4444-8423-ace6aaa9661a"). InnerVolumeSpecName "kube-api-access-thzdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.507713 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thzdz\" (UniqueName: \"kubernetes.io/projected/15ef5f34-7088-4444-8423-ace6aaa9661a-kube-api-access-thzdz\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.507755 4869 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.559434 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-util" (OuterVolumeSpecName: "util") pod "15ef5f34-7088-4444-8423-ace6aaa9661a" (UID: "15ef5f34-7088-4444-8423-ace6aaa9661a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:01:12 crc kubenswrapper[4869]: I0312 15:01:12.609104 4869 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15ef5f34-7088-4444-8423-ace6aaa9661a-util\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:13 crc kubenswrapper[4869]: I0312 15:01:13.108989 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" event={"ID":"15ef5f34-7088-4444-8423-ace6aaa9661a","Type":"ContainerDied","Data":"7782b6ea333553eb0b1bf39e1d1ba19e645b7dd03ad2185c89e4bb03e7e8a7ad"} Mar 12 15:01:13 crc kubenswrapper[4869]: I0312 15:01:13.109037 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7782b6ea333553eb0b1bf39e1d1ba19e645b7dd03ad2185c89e4bb03e7e8a7ad" Mar 12 15:01:13 crc kubenswrapper[4869]: I0312 15:01:13.109034 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.176714 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7"] Mar 12 15:01:15 crc kubenswrapper[4869]: E0312 15:01:15.177282 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerName="pull" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.177301 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerName="pull" Mar 12 15:01:15 crc kubenswrapper[4869]: E0312 15:01:15.177321 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerName="extract" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.177329 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerName="extract" Mar 12 15:01:15 crc kubenswrapper[4869]: E0312 15:01:15.177342 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerName="util" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.177350 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerName="util" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.177469 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ef5f34-7088-4444-8423-ace6aaa9661a" containerName="extract" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.177886 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.179971 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-l6d7z" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.180368 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.183972 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.188955 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7"] Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.274626 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl6qg\" (UniqueName: \"kubernetes.io/projected/6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49-kube-api-access-cl6qg\") pod \"nmstate-operator-796d4cfff4-l4dd7\" (UID: \"6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.376017 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl6qg\" (UniqueName: \"kubernetes.io/projected/6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49-kube-api-access-cl6qg\") pod \"nmstate-operator-796d4cfff4-l4dd7\" (UID: \"6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.394789 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl6qg\" (UniqueName: \"kubernetes.io/projected/6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49-kube-api-access-cl6qg\") pod \"nmstate-operator-796d4cfff4-l4dd7\" (UID: 
\"6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" Mar 12 15:01:15 crc kubenswrapper[4869]: I0312 15:01:15.494514 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" Mar 12 15:01:18 crc kubenswrapper[4869]: I0312 15:01:18.496987 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7"] Mar 12 15:01:18 crc kubenswrapper[4869]: W0312 15:01:18.505992 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd6a2b1_e9dc_4f93_9c48_a2c2d8622e49.slice/crio-5f3a1003b59754ff8b07d52c8e836ed42907d7871ce8470f548e8e88b412befb WatchSource:0}: Error finding container 5f3a1003b59754ff8b07d52c8e836ed42907d7871ce8470f548e8e88b412befb: Status 404 returned error can't find the container with id 5f3a1003b59754ff8b07d52c8e836ed42907d7871ce8470f548e8e88b412befb Mar 12 15:01:19 crc kubenswrapper[4869]: I0312 15:01:19.151708 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" event={"ID":"6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49","Type":"ContainerStarted","Data":"5f3a1003b59754ff8b07d52c8e836ed42907d7871ce8470f548e8e88b412befb"} Mar 12 15:01:19 crc kubenswrapper[4869]: I0312 15:01:19.153364 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6qsw" event={"ID":"c1f177f0-abce-4ed6-9aee-0d11fd6818ef","Type":"ContainerStarted","Data":"c8ff320857a3ce305f32a334386a2cd906e89e338d93bb5ebcbe5a640768d368"} Mar 12 15:01:20 crc kubenswrapper[4869]: I0312 15:01:20.159672 4869 generic.go:334] "Generic (PLEG): container finished" podID="c1f177f0-abce-4ed6-9aee-0d11fd6818ef" containerID="c8ff320857a3ce305f32a334386a2cd906e89e338d93bb5ebcbe5a640768d368" exitCode=0 Mar 12 15:01:20 crc kubenswrapper[4869]: I0312 15:01:20.159749 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6qsw" event={"ID":"c1f177f0-abce-4ed6-9aee-0d11fd6818ef","Type":"ContainerDied","Data":"c8ff320857a3ce305f32a334386a2cd906e89e338d93bb5ebcbe5a640768d368"} Mar 12 15:01:20 crc kubenswrapper[4869]: I0312 15:01:20.160014 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6qsw" event={"ID":"c1f177f0-abce-4ed6-9aee-0d11fd6818ef","Type":"ContainerStarted","Data":"440f267a36de6d392cd6242d4e2467da64d70f0befa2fac4def1c8525f449fec"} Mar 12 15:01:20 crc kubenswrapper[4869]: I0312 15:01:20.177354 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v6qsw" podStartSLOduration=2.691297085 podStartE2EDuration="12.177336282s" podCreationTimestamp="2026-03-12 15:01:08 +0000 UTC" firstStartedPulling="2026-03-12 15:01:10.095782041 +0000 UTC m=+822.381007319" lastFinishedPulling="2026-03-12 15:01:19.581821228 +0000 UTC m=+831.867046516" observedRunningTime="2026-03-12 15:01:20.175792908 +0000 UTC m=+832.461018196" watchObservedRunningTime="2026-03-12 15:01:20.177336282 +0000 UTC m=+832.462561560" Mar 12 15:01:21 crc kubenswrapper[4869]: I0312 15:01:21.166673 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" event={"ID":"6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49","Type":"ContainerStarted","Data":"31b8fc8bb7250922126f6f32031bfa16eaf75ec025facb35758c41bf55ccfb35"} Mar 12 15:01:21 crc kubenswrapper[4869]: I0312 15:01:21.185985 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l4dd7" podStartSLOduration=4.082781736 podStartE2EDuration="6.18596485s" podCreationTimestamp="2026-03-12 15:01:15 +0000 UTC" firstStartedPulling="2026-03-12 15:01:18.508980857 +0000 UTC m=+830.794206135" lastFinishedPulling="2026-03-12 15:01:20.612163971 +0000 UTC m=+832.897389249" 
observedRunningTime="2026-03-12 15:01:21.183759978 +0000 UTC m=+833.468985256" watchObservedRunningTime="2026-03-12 15:01:21.18596485 +0000 UTC m=+833.471190128" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.150231 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.151346 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.153031 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-88xzg" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.159785 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.160650 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.162479 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.185594 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.202673 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-b74gm"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.203391 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.215473 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.249621 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrtvn\" (UniqueName: \"kubernetes.io/projected/f2b26b4e-3c20-4729-b93e-81e8909a4c86-kube-api-access-lrtvn\") pod \"nmstate-metrics-9b8c8685d-29dkf\" (UID: \"f2b26b4e-3c20-4729-b93e-81e8909a4c86\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.249754 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7073c354-8a03-4c91-a9aa-9ec780f52b65-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-gkvrq\" (UID: \"7073c354-8a03-4c91-a9aa-9ec780f52b65\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.249800 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs9g9\" (UniqueName: \"kubernetes.io/projected/7073c354-8a03-4c91-a9aa-9ec780f52b65-kube-api-access-fs9g9\") pod \"nmstate-webhook-5f558f5558-gkvrq\" (UID: \"7073c354-8a03-4c91-a9aa-9ec780f52b65\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.328936 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.329641 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.331852 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.332203 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mqzpk" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.332347 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.347839 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.351146 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-ovs-socket\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.351194 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7073c354-8a03-4c91-a9aa-9ec780f52b65-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-gkvrq\" (UID: \"7073c354-8a03-4c91-a9aa-9ec780f52b65\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.351224 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-dbus-socket\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc 
kubenswrapper[4869]: I0312 15:01:26.351253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs9g9\" (UniqueName: \"kubernetes.io/projected/7073c354-8a03-4c91-a9aa-9ec780f52b65-kube-api-access-fs9g9\") pod \"nmstate-webhook-5f558f5558-gkvrq\" (UID: \"7073c354-8a03-4c91-a9aa-9ec780f52b65\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.351291 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrtvn\" (UniqueName: \"kubernetes.io/projected/f2b26b4e-3c20-4729-b93e-81e8909a4c86-kube-api-access-lrtvn\") pod \"nmstate-metrics-9b8c8685d-29dkf\" (UID: \"f2b26b4e-3c20-4729-b93e-81e8909a4c86\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.351314 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-nmstate-lock\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.351352 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpj8\" (UniqueName: \"kubernetes.io/projected/71ed27a7-141a-44c4-833d-baa692ec7af0-kube-api-access-pvpj8\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: E0312 15:01:26.351586 4869 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 12 15:01:26 crc kubenswrapper[4869]: E0312 15:01:26.351646 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7073c354-8a03-4c91-a9aa-9ec780f52b65-tls-key-pair podName:7073c354-8a03-4c91-a9aa-9ec780f52b65 nodeName:}" failed. No retries permitted until 2026-03-12 15:01:26.851624775 +0000 UTC m=+839.136850053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/7073c354-8a03-4c91-a9aa-9ec780f52b65-tls-key-pair") pod "nmstate-webhook-5f558f5558-gkvrq" (UID: "7073c354-8a03-4c91-a9aa-9ec780f52b65") : secret "openshift-nmstate-webhook" not found Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.375632 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs9g9\" (UniqueName: \"kubernetes.io/projected/7073c354-8a03-4c91-a9aa-9ec780f52b65-kube-api-access-fs9g9\") pod \"nmstate-webhook-5f558f5558-gkvrq\" (UID: \"7073c354-8a03-4c91-a9aa-9ec780f52b65\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.378641 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrtvn\" (UniqueName: \"kubernetes.io/projected/f2b26b4e-3c20-4729-b93e-81e8909a4c86-kube-api-access-lrtvn\") pod \"nmstate-metrics-9b8c8685d-29dkf\" (UID: \"f2b26b4e-3c20-4729-b93e-81e8909a4c86\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.452808 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-ovs-socket\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.452878 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-dbus-socket\") pod 
\"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.452920 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/685a6d03-a0dd-4818-82f4-1839b0c094c3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.452972 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-ovs-socket\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.452963 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-nmstate-lock\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.453033 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-nmstate-lock\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.453144 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qvm\" (UniqueName: \"kubernetes.io/projected/685a6d03-a0dd-4818-82f4-1839b0c094c3-kube-api-access-j2qvm\") pod 
\"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.453205 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpj8\" (UniqueName: \"kubernetes.io/projected/71ed27a7-141a-44c4-833d-baa692ec7af0-kube-api-access-pvpj8\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.453258 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/685a6d03-a0dd-4818-82f4-1839b0c094c3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.453285 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/71ed27a7-141a-44c4-833d-baa692ec7af0-dbus-socket\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.472559 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.484098 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpj8\" (UniqueName: \"kubernetes.io/projected/71ed27a7-141a-44c4-833d-baa692ec7af0-kube-api-access-pvpj8\") pod \"nmstate-handler-b74gm\" (UID: \"71ed27a7-141a-44c4-833d-baa692ec7af0\") " pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.519150 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.554629 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/685a6d03-a0dd-4818-82f4-1839b0c094c3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.554684 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2qvm\" (UniqueName: \"kubernetes.io/projected/685a6d03-a0dd-4818-82f4-1839b0c094c3-kube-api-access-j2qvm\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.554708 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/685a6d03-a0dd-4818-82f4-1839b0c094c3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.555839 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/685a6d03-a0dd-4818-82f4-1839b0c094c3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.557661 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/685a6d03-a0dd-4818-82f4-1839b0c094c3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.578211 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2qvm\" (UniqueName: \"kubernetes.io/projected/685a6d03-a0dd-4818-82f4-1839b0c094c3-kube-api-access-j2qvm\") pod \"nmstate-console-plugin-86f58fcf4-xljxl\" (UID: \"685a6d03-a0dd-4818-82f4-1839b0c094c3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.601718 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-777468bbb4-wt974"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.602625 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.625705 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777468bbb4-wt974"] Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.653770 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.762817 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-trusted-ca-bundle\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.762860 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-config\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.762960 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-oauth-serving-cert\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.762996 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pff8z\" (UniqueName: \"kubernetes.io/projected/d1882280-4ac9-4972-a830-2992c4e2c3c8-kube-api-access-pff8z\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.763016 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-oauth-config\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.763188 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-service-ca\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.763336 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-serving-cert\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864666 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-oauth-serving-cert\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864724 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pff8z\" (UniqueName: \"kubernetes.io/projected/d1882280-4ac9-4972-a830-2992c4e2c3c8-kube-api-access-pff8z\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864754 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-oauth-config\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864825 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7073c354-8a03-4c91-a9aa-9ec780f52b65-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-gkvrq\" (UID: \"7073c354-8a03-4c91-a9aa-9ec780f52b65\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864858 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-service-ca\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864916 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-serving-cert\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864940 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-trusted-ca-bundle\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.864962 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-config\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.866575 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-oauth-serving-cert\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.866717 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-trusted-ca-bundle\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.867476 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-config\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.868317 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1882280-4ac9-4972-a830-2992c4e2c3c8-service-ca\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.871153 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-oauth-config\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.871153 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7073c354-8a03-4c91-a9aa-9ec780f52b65-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-gkvrq\" (UID: \"7073c354-8a03-4c91-a9aa-9ec780f52b65\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.871261 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1882280-4ac9-4972-a830-2992c4e2c3c8-console-serving-cert\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.883649 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pff8z\" (UniqueName: \"kubernetes.io/projected/d1882280-4ac9-4972-a830-2992c4e2c3c8-kube-api-access-pff8z\") pod \"console-777468bbb4-wt974\" (UID: \"d1882280-4ac9-4972-a830-2992c4e2c3c8\") " pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.943651 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:26 crc kubenswrapper[4869]: I0312 15:01:26.963764 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf"] Mar 12 15:01:26 crc kubenswrapper[4869]: W0312 15:01:26.974757 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2b26b4e_3c20_4729_b93e_81e8909a4c86.slice/crio-7484e10bdfbf7ee4a33f9fa672781e7ba6566f54ac1b37e1f5b6d52e902616d9 WatchSource:0}: Error finding container 7484e10bdfbf7ee4a33f9fa672781e7ba6566f54ac1b37e1f5b6d52e902616d9: Status 404 returned error can't find the container with id 7484e10bdfbf7ee4a33f9fa672781e7ba6566f54ac1b37e1f5b6d52e902616d9 Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.034177 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl"] Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.092532 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.115478 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777468bbb4-wt974"] Mar 12 15:01:27 crc kubenswrapper[4869]: W0312 15:01:27.123766 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1882280_4ac9_4972_a830_2992c4e2c3c8.slice/crio-dc6cca937573a812d017dee287368ed2141bb728b5b1185965e7345b726f2174 WatchSource:0}: Error finding container dc6cca937573a812d017dee287368ed2141bb728b5b1185965e7345b726f2174: Status 404 returned error can't find the container with id dc6cca937573a812d017dee287368ed2141bb728b5b1185965e7345b726f2174 Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.203036 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" event={"ID":"685a6d03-a0dd-4818-82f4-1839b0c094c3","Type":"ContainerStarted","Data":"9dd94ba8009903f4783c062002229c89ba3ab0e1da3edcb6a281066d4bd1f471"} Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.204369 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" event={"ID":"f2b26b4e-3c20-4729-b93e-81e8909a4c86","Type":"ContainerStarted","Data":"7484e10bdfbf7ee4a33f9fa672781e7ba6566f54ac1b37e1f5b6d52e902616d9"} Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.205914 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b74gm" event={"ID":"71ed27a7-141a-44c4-833d-baa692ec7af0","Type":"ContainerStarted","Data":"66440bfa7e7e4dd00600d10d771ab7734c9d541c62f37cbdb566fdd26dbbb8a1"} Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.207784 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777468bbb4-wt974" 
event={"ID":"d1882280-4ac9-4972-a830-2992c4e2c3c8","Type":"ContainerStarted","Data":"dc6cca937573a812d017dee287368ed2141bb728b5b1185965e7345b726f2174"} Mar 12 15:01:27 crc kubenswrapper[4869]: I0312 15:01:27.273862 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq"] Mar 12 15:01:27 crc kubenswrapper[4869]: W0312 15:01:27.279937 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7073c354_8a03_4c91_a9aa_9ec780f52b65.slice/crio-df9ce43e5d4cef3d159a452a14d8696561897d6be913d9531cb87953abcb1d8e WatchSource:0}: Error finding container df9ce43e5d4cef3d159a452a14d8696561897d6be913d9531cb87953abcb1d8e: Status 404 returned error can't find the container with id df9ce43e5d4cef3d159a452a14d8696561897d6be913d9531cb87953abcb1d8e Mar 12 15:01:28 crc kubenswrapper[4869]: I0312 15:01:28.214315 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777468bbb4-wt974" event={"ID":"d1882280-4ac9-4972-a830-2992c4e2c3c8","Type":"ContainerStarted","Data":"a902aebd42d3224c87953af7b2a4f29766941dcdc436ea0914b43802563d551a"} Mar 12 15:01:28 crc kubenswrapper[4869]: I0312 15:01:28.215781 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" event={"ID":"7073c354-8a03-4c91-a9aa-9ec780f52b65","Type":"ContainerStarted","Data":"df9ce43e5d4cef3d159a452a14d8696561897d6be913d9531cb87953abcb1d8e"} Mar 12 15:01:28 crc kubenswrapper[4869]: I0312 15:01:28.241100 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-777468bbb4-wt974" podStartSLOduration=2.241076707 podStartE2EDuration="2.241076707s" podCreationTimestamp="2026-03-12 15:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:01:28.229663706 +0000 UTC 
m=+840.514889024" watchObservedRunningTime="2026-03-12 15:01:28.241076707 +0000 UTC m=+840.526301985" Mar 12 15:01:28 crc kubenswrapper[4869]: I0312 15:01:28.821398 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:28 crc kubenswrapper[4869]: I0312 15:01:28.821741 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:28 crc kubenswrapper[4869]: I0312 15:01:28.867087 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.263613 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v6qsw" Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.364308 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v6qsw"] Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.397674 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6g28"] Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.398175 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6g28" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="registry-server" containerID="cri-o://32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514" gracePeriod=2 Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.775981 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6g28" Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.926628 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9nrm\" (UniqueName: \"kubernetes.io/projected/4aedeb34-f607-43d8-89bc-dac85b2c68ba-kube-api-access-g9nrm\") pod \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.926730 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-catalog-content\") pod \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.926829 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-utilities\") pod \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\" (UID: \"4aedeb34-f607-43d8-89bc-dac85b2c68ba\") " Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.928994 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-utilities" (OuterVolumeSpecName: "utilities") pod "4aedeb34-f607-43d8-89bc-dac85b2c68ba" (UID: "4aedeb34-f607-43d8-89bc-dac85b2c68ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:01:29 crc kubenswrapper[4869]: I0312 15:01:29.932918 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aedeb34-f607-43d8-89bc-dac85b2c68ba-kube-api-access-g9nrm" (OuterVolumeSpecName: "kube-api-access-g9nrm") pod "4aedeb34-f607-43d8-89bc-dac85b2c68ba" (UID: "4aedeb34-f607-43d8-89bc-dac85b2c68ba"). InnerVolumeSpecName "kube-api-access-g9nrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.028471 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9nrm\" (UniqueName: \"kubernetes.io/projected/4aedeb34-f607-43d8-89bc-dac85b2c68ba-kube-api-access-g9nrm\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.028503 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.073314 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aedeb34-f607-43d8-89bc-dac85b2c68ba" (UID: "4aedeb34-f607-43d8-89bc-dac85b2c68ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.130231 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aedeb34-f607-43d8-89bc-dac85b2c68ba-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.226808 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" event={"ID":"f2b26b4e-3c20-4729-b93e-81e8909a4c86","Type":"ContainerStarted","Data":"bc4c19951593cc86aa7fb096b460819101017aa7b9a16e511bf34d2d0b40c80a"} Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.228802 4869 generic.go:334] "Generic (PLEG): container finished" podID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerID="32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514" exitCode=0 Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.228823 4869 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-p6g28" event={"ID":"4aedeb34-f607-43d8-89bc-dac85b2c68ba","Type":"ContainerDied","Data":"32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514"} Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.228849 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6g28" event={"ID":"4aedeb34-f607-43d8-89bc-dac85b2c68ba","Type":"ContainerDied","Data":"0c32f7809452e239700566f87ff05b0e679a14939fb35bef474c2f0f412b51ce"} Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.228867 4869 scope.go:117] "RemoveContainer" containerID="32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.228873 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6g28" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.231275 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b74gm" event={"ID":"71ed27a7-141a-44c4-833d-baa692ec7af0","Type":"ContainerStarted","Data":"1f0a54a98a5c7a0998f4cd9456c1f4fbd33f2baa81800607e3c69c128fb64af5"} Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.231551 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.233306 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" event={"ID":"7073c354-8a03-4c91-a9aa-9ec780f52b65","Type":"ContainerStarted","Data":"1da41c3dfbfb45d511714143e4a89f28d8dfc8bcd4d007a93bb747f3d9922430"} Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.233334 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.250918 4869 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-b74gm" podStartSLOduration=1.465979815 podStartE2EDuration="4.25090312s" podCreationTimestamp="2026-03-12 15:01:26 +0000 UTC" firstStartedPulling="2026-03-12 15:01:26.564582654 +0000 UTC m=+838.849807932" lastFinishedPulling="2026-03-12 15:01:29.349505959 +0000 UTC m=+841.634731237" observedRunningTime="2026-03-12 15:01:30.248368158 +0000 UTC m=+842.533593436" watchObservedRunningTime="2026-03-12 15:01:30.25090312 +0000 UTC m=+842.536128408" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.267637 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6g28"] Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.272681 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6g28"] Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.285436 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" podStartSLOduration=2.21889783 podStartE2EDuration="4.285414194s" podCreationTimestamp="2026-03-12 15:01:26 +0000 UTC" firstStartedPulling="2026-03-12 15:01:27.283210621 +0000 UTC m=+839.568435889" lastFinishedPulling="2026-03-12 15:01:29.349726975 +0000 UTC m=+841.634952253" observedRunningTime="2026-03-12 15:01:30.284288442 +0000 UTC m=+842.569513720" watchObservedRunningTime="2026-03-12 15:01:30.285414194 +0000 UTC m=+842.570639472" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.345965 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" path="/var/lib/kubelet/pods/4aedeb34-f607-43d8-89bc-dac85b2c68ba/volumes" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.868814 4869 scope.go:117] "RemoveContainer" containerID="010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c" Mar 12 15:01:30 crc kubenswrapper[4869]: 
I0312 15:01:30.885283 4869 scope.go:117] "RemoveContainer" containerID="6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.902295 4869 scope.go:117] "RemoveContainer" containerID="32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514" Mar 12 15:01:30 crc kubenswrapper[4869]: E0312 15:01:30.902920 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514\": container with ID starting with 32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514 not found: ID does not exist" containerID="32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.902968 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514"} err="failed to get container status \"32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514\": rpc error: code = NotFound desc = could not find container \"32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514\": container with ID starting with 32e8a9c91cb3ed5482c72bf170c2cae833a6a108e194b535ea94f1d5114ae514 not found: ID does not exist" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.902997 4869 scope.go:117] "RemoveContainer" containerID="010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c" Mar 12 15:01:30 crc kubenswrapper[4869]: E0312 15:01:30.904875 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c\": container with ID starting with 010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c not found: ID does not exist" 
containerID="010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.904907 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c"} err="failed to get container status \"010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c\": rpc error: code = NotFound desc = could not find container \"010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c\": container with ID starting with 010158b87ed73596d3e8829a35f709536ed5fea0c527ca6539e691f10093416c not found: ID does not exist" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.904926 4869 scope.go:117] "RemoveContainer" containerID="6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011" Mar 12 15:01:30 crc kubenswrapper[4869]: E0312 15:01:30.905284 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011\": container with ID starting with 6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011 not found: ID does not exist" containerID="6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011" Mar 12 15:01:30 crc kubenswrapper[4869]: I0312 15:01:30.905308 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011"} err="failed to get container status \"6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011\": rpc error: code = NotFound desc = could not find container \"6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011\": container with ID starting with 6060196f2d18d65563121d8ccab44b2ca195380ea598180a489e38a65cea5011 not found: ID does not exist" Mar 12 15:01:32 crc kubenswrapper[4869]: I0312 15:01:32.250100 4869 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" event={"ID":"685a6d03-a0dd-4818-82f4-1839b0c094c3","Type":"ContainerStarted","Data":"e3e1fa6f0d63879c4bd0277c857f1b7d5850e914b7450969a9297bd160fb03ad"} Mar 12 15:01:32 crc kubenswrapper[4869]: I0312 15:01:32.268597 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-xljxl" podStartSLOduration=2.116104138 podStartE2EDuration="6.268583183s" podCreationTimestamp="2026-03-12 15:01:26 +0000 UTC" firstStartedPulling="2026-03-12 15:01:27.045952406 +0000 UTC m=+839.331177684" lastFinishedPulling="2026-03-12 15:01:31.198431451 +0000 UTC m=+843.483656729" observedRunningTime="2026-03-12 15:01:32.267294007 +0000 UTC m=+844.552519285" watchObservedRunningTime="2026-03-12 15:01:32.268583183 +0000 UTC m=+844.553808451" Mar 12 15:01:33 crc kubenswrapper[4869]: I0312 15:01:33.256834 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" event={"ID":"f2b26b4e-3c20-4729-b93e-81e8909a4c86","Type":"ContainerStarted","Data":"ff6d90cb8d440a0eaefb9c4030b93a3deea3dc3a3a1921dcbe93f236eac1bcfd"} Mar 12 15:01:33 crc kubenswrapper[4869]: I0312 15:01:33.282815 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-29dkf" podStartSLOduration=1.58958258 podStartE2EDuration="7.282789118s" podCreationTimestamp="2026-03-12 15:01:26 +0000 UTC" firstStartedPulling="2026-03-12 15:01:26.978792351 +0000 UTC m=+839.264017629" lastFinishedPulling="2026-03-12 15:01:32.671998889 +0000 UTC m=+844.957224167" observedRunningTime="2026-03-12 15:01:33.275892693 +0000 UTC m=+845.561117991" watchObservedRunningTime="2026-03-12 15:01:33.282789118 +0000 UTC m=+845.568014416" Mar 12 15:01:36 crc kubenswrapper[4869]: I0312 15:01:36.553910 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-nmstate/nmstate-handler-b74gm" Mar 12 15:01:36 crc kubenswrapper[4869]: I0312 15:01:36.944228 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:36 crc kubenswrapper[4869]: I0312 15:01:36.944298 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:36 crc kubenswrapper[4869]: I0312 15:01:36.950984 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:37 crc kubenswrapper[4869]: I0312 15:01:37.286075 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-777468bbb4-wt974" Mar 12 15:01:37 crc kubenswrapper[4869]: I0312 15:01:37.368146 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qfqjj"] Mar 12 15:01:47 crc kubenswrapper[4869]: I0312 15:01:47.098649 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-gkvrq" Mar 12 15:01:49 crc kubenswrapper[4869]: I0312 15:01:49.684258 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:01:49 crc kubenswrapper[4869]: I0312 15:01:49.684687 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.702455 4869 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6"] Mar 12 15:01:58 crc kubenswrapper[4869]: E0312 15:01:58.703168 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="extract-content" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.703180 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="extract-content" Mar 12 15:01:58 crc kubenswrapper[4869]: E0312 15:01:58.703194 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="extract-utilities" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.703200 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="extract-utilities" Mar 12 15:01:58 crc kubenswrapper[4869]: E0312 15:01:58.703214 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="registry-server" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.703222 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="registry-server" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.703326 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aedeb34-f607-43d8-89bc-dac85b2c68ba" containerName="registry-server" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.704303 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.706737 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.715314 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6"] Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.719686 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.821357 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.821419 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3647e6ae-9add-4c7f-a1d8-abb0397e4954-kube-api-access-6qp56\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: 
I0312 15:01:58.821498 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.822227 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.922996 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.923050 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3647e6ae-9add-4c7f-a1d8-abb0397e4954-kube-api-access-6qp56\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.923444 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:58 crc kubenswrapper[4869]: I0312 15:01:58.942518 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3647e6ae-9add-4c7f-a1d8-abb0397e4954-kube-api-access-6qp56\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:59 crc kubenswrapper[4869]: I0312 15:01:59.021169 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:01:59 crc kubenswrapper[4869]: I0312 15:01:59.428428 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6"] Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.133995 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555462-wdv62"] Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.134949 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-wdv62" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.136245 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccknd\" (UniqueName: \"kubernetes.io/projected/dd70d1b1-5059-4e5f-9217-eeb022665766-kube-api-access-ccknd\") pod \"auto-csr-approver-29555462-wdv62\" (UID: \"dd70d1b1-5059-4e5f-9217-eeb022665766\") " pod="openshift-infra/auto-csr-approver-29555462-wdv62" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.137344 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.137643 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.140940 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.142502 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-wdv62"] Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.237652 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccknd\" (UniqueName: \"kubernetes.io/projected/dd70d1b1-5059-4e5f-9217-eeb022665766-kube-api-access-ccknd\") pod \"auto-csr-approver-29555462-wdv62\" (UID: \"dd70d1b1-5059-4e5f-9217-eeb022665766\") " pod="openshift-infra/auto-csr-approver-29555462-wdv62" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.266908 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccknd\" (UniqueName: \"kubernetes.io/projected/dd70d1b1-5059-4e5f-9217-eeb022665766-kube-api-access-ccknd\") pod \"auto-csr-approver-29555462-wdv62\" (UID: \"dd70d1b1-5059-4e5f-9217-eeb022665766\") " 
pod="openshift-infra/auto-csr-approver-29555462-wdv62" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.427237 4869 generic.go:334] "Generic (PLEG): container finished" podID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerID="1cd4cd937b2a3c80ef4cc8706a6ff97fd04c9bc617922c2bd7f75eaf978a6ce3" exitCode=0 Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.427322 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" event={"ID":"3647e6ae-9add-4c7f-a1d8-abb0397e4954","Type":"ContainerDied","Data":"1cd4cd937b2a3c80ef4cc8706a6ff97fd04c9bc617922c2bd7f75eaf978a6ce3"} Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.427375 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" event={"ID":"3647e6ae-9add-4c7f-a1d8-abb0397e4954","Type":"ContainerStarted","Data":"d2c790e93e6af0dd8250622bf23f380284502075051b10ef71f628f06f81c175"} Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.452157 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-wdv62" Mar 12 15:02:00 crc kubenswrapper[4869]: I0312 15:02:00.649519 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-wdv62"] Mar 12 15:02:00 crc kubenswrapper[4869]: W0312 15:02:00.661661 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd70d1b1_5059_4e5f_9217_eeb022665766.slice/crio-8bf2cb40a45bae4fc538ec43c460e668ab4fe028d7f2dd1bee38fc5957090a7f WatchSource:0}: Error finding container 8bf2cb40a45bae4fc538ec43c460e668ab4fe028d7f2dd1bee38fc5957090a7f: Status 404 returned error can't find the container with id 8bf2cb40a45bae4fc538ec43c460e668ab4fe028d7f2dd1bee38fc5957090a7f Mar 12 15:02:01 crc kubenswrapper[4869]: I0312 15:02:01.432893 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555462-wdv62" event={"ID":"dd70d1b1-5059-4e5f-9217-eeb022665766","Type":"ContainerStarted","Data":"8bf2cb40a45bae4fc538ec43c460e668ab4fe028d7f2dd1bee38fc5957090a7f"} Mar 12 15:02:02 crc kubenswrapper[4869]: I0312 15:02:02.413002 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-qfqjj" podUID="e61f813a-db17-46a6-a380-9f13452ef07b" containerName="console" containerID="cri-o://1fe54d1f7b5ae6ead062dc47282a3afb0ee42dd63b1d261fe6c4f8314dbea557" gracePeriod=15 Mar 12 15:02:02 crc kubenswrapper[4869]: I0312 15:02:02.451802 4869 generic.go:334] "Generic (PLEG): container finished" podID="dd70d1b1-5059-4e5f-9217-eeb022665766" containerID="d9fe9d3940873f347f06b02ae3fc65897bb6879cd59752fc2822fffb67ef4540" exitCode=0 Mar 12 15:02:02 crc kubenswrapper[4869]: I0312 15:02:02.451857 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555462-wdv62" 
event={"ID":"dd70d1b1-5059-4e5f-9217-eeb022665766","Type":"ContainerDied","Data":"d9fe9d3940873f347f06b02ae3fc65897bb6879cd59752fc2822fffb67ef4540"} Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.462831 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qfqjj_e61f813a-db17-46a6-a380-9f13452ef07b/console/0.log" Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.463284 4869 generic.go:334] "Generic (PLEG): container finished" podID="e61f813a-db17-46a6-a380-9f13452ef07b" containerID="1fe54d1f7b5ae6ead062dc47282a3afb0ee42dd63b1d261fe6c4f8314dbea557" exitCode=2 Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.463349 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qfqjj" event={"ID":"e61f813a-db17-46a6-a380-9f13452ef07b","Type":"ContainerDied","Data":"1fe54d1f7b5ae6ead062dc47282a3afb0ee42dd63b1d261fe6c4f8314dbea557"} Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.931444 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-wdv62" Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.937889 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qfqjj_e61f813a-db17-46a6-a380-9f13452ef07b/console/0.log" Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.937963 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998002 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m48bv\" (UniqueName: \"kubernetes.io/projected/e61f813a-db17-46a6-a380-9f13452ef07b-kube-api-access-m48bv\") pod \"e61f813a-db17-46a6-a380-9f13452ef07b\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998300 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-console-config\") pod \"e61f813a-db17-46a6-a380-9f13452ef07b\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998331 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-trusted-ca-bundle\") pod \"e61f813a-db17-46a6-a380-9f13452ef07b\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998349 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-serving-cert\") pod \"e61f813a-db17-46a6-a380-9f13452ef07b\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998403 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-oauth-config\") pod \"e61f813a-db17-46a6-a380-9f13452ef07b\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998450 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ccknd\" (UniqueName: \"kubernetes.io/projected/dd70d1b1-5059-4e5f-9217-eeb022665766-kube-api-access-ccknd\") pod \"dd70d1b1-5059-4e5f-9217-eeb022665766\" (UID: \"dd70d1b1-5059-4e5f-9217-eeb022665766\") " Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998483 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-service-ca\") pod \"e61f813a-db17-46a6-a380-9f13452ef07b\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " Mar 12 15:02:03 crc kubenswrapper[4869]: I0312 15:02:03.998503 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-oauth-serving-cert\") pod \"e61f813a-db17-46a6-a380-9f13452ef07b\" (UID: \"e61f813a-db17-46a6-a380-9f13452ef07b\") " Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:03.999497 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e61f813a-db17-46a6-a380-9f13452ef07b" (UID: "e61f813a-db17-46a6-a380-9f13452ef07b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:03.999555 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-console-config" (OuterVolumeSpecName: "console-config") pod "e61f813a-db17-46a6-a380-9f13452ef07b" (UID: "e61f813a-db17-46a6-a380-9f13452ef07b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:03.999551 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-service-ca" (OuterVolumeSpecName: "service-ca") pod "e61f813a-db17-46a6-a380-9f13452ef07b" (UID: "e61f813a-db17-46a6-a380-9f13452ef07b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:03.999649 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e61f813a-db17-46a6-a380-9f13452ef07b" (UID: "e61f813a-db17-46a6-a380-9f13452ef07b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.005994 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd70d1b1-5059-4e5f-9217-eeb022665766-kube-api-access-ccknd" (OuterVolumeSpecName: "kube-api-access-ccknd") pod "dd70d1b1-5059-4e5f-9217-eeb022665766" (UID: "dd70d1b1-5059-4e5f-9217-eeb022665766"). InnerVolumeSpecName "kube-api-access-ccknd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.006412 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61f813a-db17-46a6-a380-9f13452ef07b-kube-api-access-m48bv" (OuterVolumeSpecName: "kube-api-access-m48bv") pod "e61f813a-db17-46a6-a380-9f13452ef07b" (UID: "e61f813a-db17-46a6-a380-9f13452ef07b"). InnerVolumeSpecName "kube-api-access-m48bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.006481 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e61f813a-db17-46a6-a380-9f13452ef07b" (UID: "e61f813a-db17-46a6-a380-9f13452ef07b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.022211 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e61f813a-db17-46a6-a380-9f13452ef07b" (UID: "e61f813a-db17-46a6-a380-9f13452ef07b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100172 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m48bv\" (UniqueName: \"kubernetes.io/projected/e61f813a-db17-46a6-a380-9f13452ef07b-kube-api-access-m48bv\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100270 4869 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-console-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100328 4869 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100348 4869 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100365 4869 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e61f813a-db17-46a6-a380-9f13452ef07b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100420 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccknd\" (UniqueName: \"kubernetes.io/projected/dd70d1b1-5059-4e5f-9217-eeb022665766-kube-api-access-ccknd\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100438 4869 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.100456 4869 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e61f813a-db17-46a6-a380-9f13452ef07b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.476168 4869 generic.go:334] "Generic (PLEG): container finished" podID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerID="6158f0825e44ab58b93b3879a6dbfa8b8273a68863008dd177b730be9f1fe51a" exitCode=0 Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.476232 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" event={"ID":"3647e6ae-9add-4c7f-a1d8-abb0397e4954","Type":"ContainerDied","Data":"6158f0825e44ab58b93b3879a6dbfa8b8273a68863008dd177b730be9f1fe51a"} Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.480501 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-qfqjj_e61f813a-db17-46a6-a380-9f13452ef07b/console/0.log" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.480582 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qfqjj" event={"ID":"e61f813a-db17-46a6-a380-9f13452ef07b","Type":"ContainerDied","Data":"a391b4ea97251748aa8094411d4e11a2d4e97590212b932d592a20fe1ced765d"} Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.480617 4869 scope.go:117] "RemoveContainer" containerID="1fe54d1f7b5ae6ead062dc47282a3afb0ee42dd63b1d261fe6c4f8314dbea557" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.480732 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qfqjj" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.484210 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555462-wdv62" event={"ID":"dd70d1b1-5059-4e5f-9217-eeb022665766","Type":"ContainerDied","Data":"8bf2cb40a45bae4fc538ec43c460e668ab4fe028d7f2dd1bee38fc5957090a7f"} Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.484284 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bf2cb40a45bae4fc538ec43c460e668ab4fe028d7f2dd1bee38fc5957090a7f" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.484375 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-wdv62" Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.510113 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qfqjj"] Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.513383 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-qfqjj"] Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.987228 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-2vfdb"] Mar 12 15:02:04 crc kubenswrapper[4869]: I0312 15:02:04.992590 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-2vfdb"] Mar 12 15:02:05 crc kubenswrapper[4869]: I0312 15:02:05.494431 4869 generic.go:334] "Generic (PLEG): container finished" podID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerID="0e755e2c1870278f9d5f6af88d6e97fb677528f9e21eabeb946a43068b515a33" exitCode=0 Mar 12 15:02:05 crc kubenswrapper[4869]: I0312 15:02:05.494493 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" event={"ID":"3647e6ae-9add-4c7f-a1d8-abb0397e4954","Type":"ContainerDied","Data":"0e755e2c1870278f9d5f6af88d6e97fb677528f9e21eabeb946a43068b515a33"} Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.342757 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c74cbb-f1e0-4776-aa0f-17893bf634f9" path="/var/lib/kubelet/pods/e2c74cbb-f1e0-4776-aa0f-17893bf634f9/volumes" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.343420 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61f813a-db17-46a6-a380-9f13452ef07b" path="/var/lib/kubelet/pods/e61f813a-db17-46a6-a380-9f13452ef07b/volumes" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.764497 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.832813 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-bundle\") pod \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.832890 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-util\") pod \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.833088 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3647e6ae-9add-4c7f-a1d8-abb0397e4954-kube-api-access-6qp56\") pod \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\" (UID: \"3647e6ae-9add-4c7f-a1d8-abb0397e4954\") " Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.834621 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-bundle" (OuterVolumeSpecName: "bundle") pod "3647e6ae-9add-4c7f-a1d8-abb0397e4954" (UID: "3647e6ae-9add-4c7f-a1d8-abb0397e4954"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.838702 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3647e6ae-9add-4c7f-a1d8-abb0397e4954-kube-api-access-6qp56" (OuterVolumeSpecName: "kube-api-access-6qp56") pod "3647e6ae-9add-4c7f-a1d8-abb0397e4954" (UID: "3647e6ae-9add-4c7f-a1d8-abb0397e4954"). InnerVolumeSpecName "kube-api-access-6qp56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.846461 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-util" (OuterVolumeSpecName: "util") pod "3647e6ae-9add-4c7f-a1d8-abb0397e4954" (UID: "3647e6ae-9add-4c7f-a1d8-abb0397e4954"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.934717 4869 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.934755 4869 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3647e6ae-9add-4c7f-a1d8-abb0397e4954-util\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:06 crc kubenswrapper[4869]: I0312 15:02:06.934768 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3647e6ae-9add-4c7f-a1d8-abb0397e4954-kube-api-access-6qp56\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:07 crc kubenswrapper[4869]: I0312 15:02:07.510758 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" event={"ID":"3647e6ae-9add-4c7f-a1d8-abb0397e4954","Type":"ContainerDied","Data":"d2c790e93e6af0dd8250622bf23f380284502075051b10ef71f628f06f81c175"} Mar 12 15:02:07 crc kubenswrapper[4869]: I0312 15:02:07.510824 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2c790e93e6af0dd8250622bf23f380284502075051b10ef71f628f06f81c175" Mar 12 15:02:07 crc kubenswrapper[4869]: I0312 15:02:07.510897 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.720386 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x"] Mar 12 15:02:17 crc kubenswrapper[4869]: E0312 15:02:17.721284 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerName="util" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721299 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerName="util" Mar 12 15:02:17 crc kubenswrapper[4869]: E0312 15:02:17.721314 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61f813a-db17-46a6-a380-9f13452ef07b" containerName="console" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721322 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61f813a-db17-46a6-a380-9f13452ef07b" containerName="console" Mar 12 15:02:17 crc kubenswrapper[4869]: E0312 15:02:17.721333 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerName="extract" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721341 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerName="extract" Mar 12 15:02:17 crc kubenswrapper[4869]: E0312 15:02:17.721353 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerName="pull" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721361 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerName="pull" Mar 12 15:02:17 crc kubenswrapper[4869]: E0312 15:02:17.721380 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd70d1b1-5059-4e5f-9217-eeb022665766" 
containerName="oc" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721388 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd70d1b1-5059-4e5f-9217-eeb022665766" containerName="oc" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721503 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd70d1b1-5059-4e5f-9217-eeb022665766" containerName="oc" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721518 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3647e6ae-9add-4c7f-a1d8-abb0397e4954" containerName="extract" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.721529 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61f813a-db17-46a6-a380-9f13452ef07b" containerName="console" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.722008 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.724426 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.724571 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.724672 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-v678q" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.725497 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.728532 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.737327 4869 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x"] Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.765296 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvs4s\" (UniqueName: \"kubernetes.io/projected/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-kube-api-access-pvs4s\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.765356 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-apiservice-cert\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.765383 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-webhook-cert\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.866311 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvs4s\" (UniqueName: \"kubernetes.io/projected/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-kube-api-access-pvs4s\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 
15:02:17.866686 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-apiservice-cert\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.866705 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-webhook-cert\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.874298 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-apiservice-cert\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.878286 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-webhook-cert\") pod \"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:17 crc kubenswrapper[4869]: I0312 15:02:17.885231 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvs4s\" (UniqueName: \"kubernetes.io/projected/2c705e1d-1ea5-4f81-a1d9-9888f96b36be-kube-api-access-pvs4s\") pod 
\"metallb-operator-controller-manager-7899659c6c-5rm9x\" (UID: \"2c705e1d-1ea5-4f81-a1d9-9888f96b36be\") " pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.039471 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.107485 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d"] Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.109938 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.111949 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.112240 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-dfpww" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.112491 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.138327 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d"] Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.170883 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg2g4\" (UniqueName: \"kubernetes.io/projected/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-kube-api-access-tg2g4\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc 
kubenswrapper[4869]: I0312 15:02:18.171405 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-webhook-cert\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.171489 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-apiservice-cert\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.273132 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-apiservice-cert\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.273217 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg2g4\" (UniqueName: \"kubernetes.io/projected/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-kube-api-access-tg2g4\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.273246 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-webhook-cert\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.277595 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-apiservice-cert\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.287650 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-webhook-cert\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.292321 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg2g4\" (UniqueName: \"kubernetes.io/projected/c5eb89db-d2ff-4a93-b12b-83ac78e93e65-kube-api-access-tg2g4\") pod \"metallb-operator-webhook-server-86c86ddd4-pds8d\" (UID: \"c5eb89db-d2ff-4a93-b12b-83ac78e93e65\") " pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.315228 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x"] Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.443355 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.573298 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" event={"ID":"2c705e1d-1ea5-4f81-a1d9-9888f96b36be","Type":"ContainerStarted","Data":"659528180c1cf57a5db5411fd15883bb1599a17aaf36c3c352ffb7eb8034c19d"} Mar 12 15:02:18 crc kubenswrapper[4869]: I0312 15:02:18.677201 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d"] Mar 12 15:02:18 crc kubenswrapper[4869]: W0312 15:02:18.688123 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5eb89db_d2ff_4a93_b12b_83ac78e93e65.slice/crio-b31f18303e8848e2f055dd0db57df0ac7c89c1eab4c6c2c9129b04cc40ef26e2 WatchSource:0}: Error finding container b31f18303e8848e2f055dd0db57df0ac7c89c1eab4c6c2c9129b04cc40ef26e2: Status 404 returned error can't find the container with id b31f18303e8848e2f055dd0db57df0ac7c89c1eab4c6c2c9129b04cc40ef26e2 Mar 12 15:02:19 crc kubenswrapper[4869]: I0312 15:02:19.582929 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" event={"ID":"c5eb89db-d2ff-4a93-b12b-83ac78e93e65","Type":"ContainerStarted","Data":"b31f18303e8848e2f055dd0db57df0ac7c89c1eab4c6c2c9129b04cc40ef26e2"} Mar 12 15:02:19 crc kubenswrapper[4869]: I0312 15:02:19.683842 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:02:19 crc kubenswrapper[4869]: I0312 15:02:19.683906 4869 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:02:22 crc kubenswrapper[4869]: I0312 15:02:22.610016 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" event={"ID":"2c705e1d-1ea5-4f81-a1d9-9888f96b36be","Type":"ContainerStarted","Data":"5adc925f0240883f3508e7e02d98137e175c82da2c67b8c38544193f93e9dfb9"} Mar 12 15:02:22 crc kubenswrapper[4869]: I0312 15:02:22.610692 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" Mar 12 15:02:22 crc kubenswrapper[4869]: I0312 15:02:22.631807 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x" podStartSLOduration=2.21568011 podStartE2EDuration="5.631791833s" podCreationTimestamp="2026-03-12 15:02:17 +0000 UTC" firstStartedPulling="2026-03-12 15:02:18.328511962 +0000 UTC m=+890.613737240" lastFinishedPulling="2026-03-12 15:02:21.744623685 +0000 UTC m=+894.029848963" observedRunningTime="2026-03-12 15:02:22.630371203 +0000 UTC m=+894.915596481" watchObservedRunningTime="2026-03-12 15:02:22.631791833 +0000 UTC m=+894.917017111" Mar 12 15:02:23 crc kubenswrapper[4869]: I0312 15:02:23.617292 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" event={"ID":"c5eb89db-d2ff-4a93-b12b-83ac78e93e65","Type":"ContainerStarted","Data":"306af903c3db9ebfa25455341da10c142580b6ec464000d73c2d992d544132d9"} Mar 12 15:02:23 crc kubenswrapper[4869]: I0312 15:02:23.641141 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" podStartSLOduration=1.02893524 podStartE2EDuration="5.64112336s" podCreationTimestamp="2026-03-12 15:02:18 +0000 UTC" firstStartedPulling="2026-03-12 15:02:18.693214844 +0000 UTC m=+890.978440122" lastFinishedPulling="2026-03-12 15:02:23.305402964 +0000 UTC m=+895.590628242" observedRunningTime="2026-03-12 15:02:23.640264785 +0000 UTC m=+895.925490063" watchObservedRunningTime="2026-03-12 15:02:23.64112336 +0000 UTC m=+895.926348638" Mar 12 15:02:24 crc kubenswrapper[4869]: I0312 15:02:24.622504 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:29 crc kubenswrapper[4869]: I0312 15:02:29.228610 4869 scope.go:117] "RemoveContainer" containerID="3ec89ccc59edba5da1e1493ceee89c161d17d3f4895126e8e89c4fabe29162a8" Mar 12 15:02:38 crc kubenswrapper[4869]: I0312 15:02:38.448492 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-86c86ddd4-pds8d" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.601791 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7cpf8"] Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.603118 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.630864 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7cpf8"] Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.689146 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdkl\" (UniqueName: \"kubernetes.io/projected/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-kube-api-access-jhdkl\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.689233 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-utilities\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.689260 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-catalog-content\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.790985 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-utilities\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.791036 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-catalog-content\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.791115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdkl\" (UniqueName: \"kubernetes.io/projected/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-kube-api-access-jhdkl\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.791500 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-utilities\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.792230 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-catalog-content\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.812510 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdkl\" (UniqueName: \"kubernetes.io/projected/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-kube-api-access-jhdkl\") pod \"community-operators-7cpf8\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") " pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:41 crc kubenswrapper[4869]: I0312 15:02:41.922816 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7cpf8" Mar 12 15:02:42 crc kubenswrapper[4869]: I0312 15:02:42.161482 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7cpf8"] Mar 12 15:02:42 crc kubenswrapper[4869]: I0312 15:02:42.725062 4869 generic.go:334] "Generic (PLEG): container finished" podID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerID="9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e" exitCode=0 Mar 12 15:02:42 crc kubenswrapper[4869]: I0312 15:02:42.725126 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cpf8" event={"ID":"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca","Type":"ContainerDied","Data":"9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e"} Mar 12 15:02:42 crc kubenswrapper[4869]: I0312 15:02:42.725358 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cpf8" event={"ID":"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca","Type":"ContainerStarted","Data":"4a4700f9d1f7c3e8b5bfcbd98d78f440c0a62f1e3256fa120e7d643ec15cd119"} Mar 12 15:02:45 crc kubenswrapper[4869]: I0312 15:02:45.751352 4869 generic.go:334] "Generic (PLEG): container finished" podID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerID="a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c" exitCode=0 Mar 12 15:02:45 crc kubenswrapper[4869]: I0312 15:02:45.751735 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cpf8" event={"ID":"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca","Type":"ContainerDied","Data":"a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c"} Mar 12 15:02:46 crc kubenswrapper[4869]: I0312 15:02:46.758692 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cpf8" 
event={"ID":"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca","Type":"ContainerStarted","Data":"39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48"}
Mar 12 15:02:46 crc kubenswrapper[4869]: I0312 15:02:46.778532 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7cpf8" podStartSLOduration=2.355817961 podStartE2EDuration="5.77851658s" podCreationTimestamp="2026-03-12 15:02:41 +0000 UTC" firstStartedPulling="2026-03-12 15:02:42.726999974 +0000 UTC m=+915.012225252" lastFinishedPulling="2026-03-12 15:02:46.149698583 +0000 UTC m=+918.434923871" observedRunningTime="2026-03-12 15:02:46.773532359 +0000 UTC m=+919.058757637" watchObservedRunningTime="2026-03-12 15:02:46.77851658 +0000 UTC m=+919.063741858"
Mar 12 15:02:49 crc kubenswrapper[4869]: I0312 15:02:49.683933 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:02:49 crc kubenswrapper[4869]: I0312 15:02:49.683993 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:02:49 crc kubenswrapper[4869]: I0312 15:02:49.684038 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz"
Mar 12 15:02:49 crc kubenswrapper[4869]: I0312 15:02:49.684667 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f13d6e2c562cf7652d696fbd35b8d5dcd20c099639553782e920f330cc3ff75c"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 15:02:49 crc kubenswrapper[4869]: I0312 15:02:49.684732 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://f13d6e2c562cf7652d696fbd35b8d5dcd20c099639553782e920f330cc3ff75c" gracePeriod=600
Mar 12 15:02:50 crc kubenswrapper[4869]: I0312 15:02:50.781197 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="f13d6e2c562cf7652d696fbd35b8d5dcd20c099639553782e920f330cc3ff75c" exitCode=0
Mar 12 15:02:50 crc kubenswrapper[4869]: I0312 15:02:50.781253 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"f13d6e2c562cf7652d696fbd35b8d5dcd20c099639553782e920f330cc3ff75c"}
Mar 12 15:02:50 crc kubenswrapper[4869]: I0312 15:02:50.781566 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"e0eefeacb8cbd3ff8e6abbc8c8ef619674d30f19c6c54cde868b340244e6c200"}
Mar 12 15:02:50 crc kubenswrapper[4869]: I0312 15:02:50.781598 4869 scope.go:117] "RemoveContainer" containerID="782a4d0854bd913b58ff7c98e0d483c61a66e1f21a0fa88c58c100bef1826876"
Mar 12 15:02:51 crc kubenswrapper[4869]: I0312 15:02:51.923479 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7cpf8"
Mar 12 15:02:51 crc kubenswrapper[4869]: I0312 15:02:51.923522 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7cpf8"
Mar 12 15:02:51 crc kubenswrapper[4869]: I0312 15:02:51.978463 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7cpf8"
Mar 12 15:02:52 crc kubenswrapper[4869]: I0312 15:02:52.851055 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7cpf8"
Mar 12 15:02:52 crc kubenswrapper[4869]: I0312 15:02:52.895115 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7cpf8"]
Mar 12 15:02:54 crc kubenswrapper[4869]: I0312 15:02:54.805520 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7cpf8" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="registry-server" containerID="cri-o://39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48" gracePeriod=2
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.183713 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7cpf8"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.257187 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhdkl\" (UniqueName: \"kubernetes.io/projected/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-kube-api-access-jhdkl\") pod \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") "
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.257286 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-utilities\") pod \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") "
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.257346 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-catalog-content\") pod \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\" (UID: \"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca\") "
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.258824 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-utilities" (OuterVolumeSpecName: "utilities") pod "63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" (UID: "63847a0d-c0a9-4d88-8b1e-a8f5b68134ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.266597 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-kube-api-access-jhdkl" (OuterVolumeSpecName: "kube-api-access-jhdkl") pod "63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" (UID: "63847a0d-c0a9-4d88-8b1e-a8f5b68134ca"). InnerVolumeSpecName "kube-api-access-jhdkl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.325989 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" (UID: "63847a0d-c0a9-4d88-8b1e-a8f5b68134ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.358484 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.358524 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.358551 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhdkl\" (UniqueName: \"kubernetes.io/projected/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca-kube-api-access-jhdkl\") on node \"crc\" DevicePath \"\""
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.811941 4869 generic.go:334] "Generic (PLEG): container finished" podID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerID="39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48" exitCode=0
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.811983 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cpf8" event={"ID":"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca","Type":"ContainerDied","Data":"39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48"}
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.812013 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7cpf8" event={"ID":"63847a0d-c0a9-4d88-8b1e-a8f5b68134ca","Type":"ContainerDied","Data":"4a4700f9d1f7c3e8b5bfcbd98d78f440c0a62f1e3256fa120e7d643ec15cd119"}
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.812030 4869 scope.go:117] "RemoveContainer" containerID="39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.812045 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7cpf8"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.830294 4869 scope.go:117] "RemoveContainer" containerID="a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.853951 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7cpf8"]
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.859698 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7cpf8"]
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.859720 4869 scope.go:117] "RemoveContainer" containerID="9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.877225 4869 scope.go:117] "RemoveContainer" containerID="39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48"
Mar 12 15:02:55 crc kubenswrapper[4869]: E0312 15:02:55.878013 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48\": container with ID starting with 39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48 not found: ID does not exist" containerID="39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.878050 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48"} err="failed to get container status \"39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48\": rpc error: code = NotFound desc = could not find container \"39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48\": container with ID starting with 39aaa4d75142ca03721c86bdd3851610688ad30aaee3be8c83d61942920c3c48 not found: ID does not exist"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.878078 4869 scope.go:117] "RemoveContainer" containerID="a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c"
Mar 12 15:02:55 crc kubenswrapper[4869]: E0312 15:02:55.878508 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c\": container with ID starting with a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c not found: ID does not exist" containerID="a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.878574 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c"} err="failed to get container status \"a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c\": rpc error: code = NotFound desc = could not find container \"a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c\": container with ID starting with a93c10046390fad4e2d5dafff32e88f8ecf12e8ad03f3756dbb4e1995185fb9c not found: ID does not exist"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.878608 4869 scope.go:117] "RemoveContainer" containerID="9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e"
Mar 12 15:02:55 crc kubenswrapper[4869]: E0312 15:02:55.878935 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e\": container with ID starting with 9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e not found: ID does not exist" containerID="9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e"
Mar 12 15:02:55 crc kubenswrapper[4869]: I0312 15:02:55.878966 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e"} err="failed to get container status \"9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e\": rpc error: code = NotFound desc = could not find container \"9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e\": container with ID starting with 9f2721d63c31017c28d473ea360e1335aa969e6dad026ae9d2dd3440fba3893e not found: ID does not exist"
Mar 12 15:02:56 crc kubenswrapper[4869]: I0312 15:02:56.344267 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" path="/var/lib/kubelet/pods/63847a0d-c0a9-4d88-8b1e-a8f5b68134ca/volumes"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.042694 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7899659c6c-5rm9x"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.759691 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kk29n"]
Mar 12 15:02:58 crc kubenswrapper[4869]: E0312 15:02:58.759948 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="extract-content"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.759960 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="extract-content"
Mar 12 15:02:58 crc kubenswrapper[4869]: E0312 15:02:58.759975 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="registry-server"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.759981 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="registry-server"
Mar 12 15:02:58 crc kubenswrapper[4869]: E0312 15:02:58.759991 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="extract-utilities"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.759997 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="extract-utilities"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.760125 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="63847a0d-c0a9-4d88-8b1e-a8f5b68134ca" containerName="registry-server"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.762028 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.764656 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.764833 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-c7vx2"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.766197 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.784647 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"]
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.785468 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.788723 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.808823 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"]
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.874488 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jkzb4"]
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.875674 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.881986 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.882189 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hg8lx"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.882308 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.882345 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909125 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4c5190c8-103c-41d8-b144-a5fc95608626-frr-startup\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909216 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-frr-sockets\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909247 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c5190c8-103c-41d8-b144-a5fc95608626-metrics-certs\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909270 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-reloader\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909298 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr6f9\" (UniqueName: \"kubernetes.io/projected/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-kube-api-access-cr6f9\") pod \"frr-k8s-webhook-server-bcc4b6f68-gw94s\" (UID: \"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909363 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-metrics\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909418 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-frr-conf\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909454 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gw94s\" (UID: \"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.909499 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs4cs\" (UniqueName: \"kubernetes.io/projected/4c5190c8-103c-41d8-b144-a5fc95608626-kube-api-access-rs4cs\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.919609 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-xr7s4"]
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.920722 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.924777 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 12 15:02:58 crc kubenswrapper[4869]: I0312 15:02:58.963602 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-xr7s4"]
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.010984 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gw94s\" (UID: \"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011037 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs4cs\" (UniqueName: \"kubernetes.io/projected/4c5190c8-103c-41d8-b144-a5fc95608626-kube-api-access-rs4cs\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011076 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011114 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4c5190c8-103c-41d8-b144-a5fc95608626-frr-startup\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011135 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-frr-sockets\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011152 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c5190c8-103c-41d8-b144-a5fc95608626-metrics-certs\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011168 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-reloader\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011184 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr6f9\" (UniqueName: \"kubernetes.io/projected/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-kube-api-access-cr6f9\") pod \"frr-k8s-webhook-server-bcc4b6f68-gw94s\" (UID: \"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011205 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4162c7d8-9940-4754-884e-7a8ed66d3281-metallb-excludel2\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011223 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-metrics-certs\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011243 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-metrics\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011261 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-metrics-certs\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011278 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw69z\" (UniqueName: \"kubernetes.io/projected/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-kube-api-access-xw69z\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011294 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7jt\" (UniqueName: \"kubernetes.io/projected/4162c7d8-9940-4754-884e-7a8ed66d3281-kube-api-access-gc7jt\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011317 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-cert\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011331 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-frr-conf\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.011730 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-frr-conf\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.011811 4869 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.011849 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-cert podName:2d1f63b9-3da1-4d8d-a803-6d8e1cec0081 nodeName:}" failed. No retries permitted until 2026-03-12 15:02:59.511834572 +0000 UTC m=+931.797059850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-cert") pod "frr-k8s-webhook-server-bcc4b6f68-gw94s" (UID: "2d1f63b9-3da1-4d8d-a803-6d8e1cec0081") : secret "frr-k8s-webhook-server-cert" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.012893 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4c5190c8-103c-41d8-b144-a5fc95608626-frr-startup\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.013086 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-frr-sockets\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.013139 4869 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.013167 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5190c8-103c-41d8-b144-a5fc95608626-metrics-certs podName:4c5190c8-103c-41d8-b144-a5fc95608626 nodeName:}" failed. No retries permitted until 2026-03-12 15:02:59.51315893 +0000 UTC m=+931.798384208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c5190c8-103c-41d8-b144-a5fc95608626-metrics-certs") pod "frr-k8s-kk29n" (UID: "4c5190c8-103c-41d8-b144-a5fc95608626") : secret "frr-k8s-certs-secret" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.013368 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-reloader\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.013679 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4c5190c8-103c-41d8-b144-a5fc95608626-metrics\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.045222 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr6f9\" (UniqueName: \"kubernetes.io/projected/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-kube-api-access-cr6f9\") pod \"frr-k8s-webhook-server-bcc4b6f68-gw94s\" (UID: \"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.045290 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs4cs\" (UniqueName: \"kubernetes.io/projected/4c5190c8-103c-41d8-b144-a5fc95608626-kube-api-access-rs4cs\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.112372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-metrics-certs\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.112429 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-metrics-certs\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.112454 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw69z\" (UniqueName: \"kubernetes.io/projected/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-kube-api-access-xw69z\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.112484 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7jt\" (UniqueName: \"kubernetes.io/projected/4162c7d8-9940-4754-884e-7a8ed66d3281-kube-api-access-gc7jt\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.112520 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-cert\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.112590 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.112592 4869 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.112660 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-metrics-certs podName:4162c7d8-9940-4754-884e-7a8ed66d3281 nodeName:}" failed. No retries permitted until 2026-03-12 15:02:59.612641167 +0000 UTC m=+931.897866445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-metrics-certs") pod "speaker-jkzb4" (UID: "4162c7d8-9940-4754-884e-7a8ed66d3281") : secret "speaker-certs-secret" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.112685 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4162c7d8-9940-4754-884e-7a8ed66d3281-metallb-excludel2\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.112715 4869 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.112773 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist podName:4162c7d8-9940-4754-884e-7a8ed66d3281 nodeName:}" failed. No retries permitted until 2026-03-12 15:02:59.612753891 +0000 UTC m=+931.897979279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist") pod "speaker-jkzb4" (UID: "4162c7d8-9940-4754-884e-7a8ed66d3281") : secret "metallb-memberlist" not found
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.113330 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4162c7d8-9940-4754-884e-7a8ed66d3281-metallb-excludel2\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.114722 4869 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.117016 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-metrics-certs\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.126131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-cert\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.130275 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7jt\" (UniqueName: \"kubernetes.io/projected/4162c7d8-9940-4754-884e-7a8ed66d3281-kube-api-access-gc7jt\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.142418 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw69z\" (UniqueName: \"kubernetes.io/projected/0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a-kube-api-access-xw69z\") pod \"controller-7bb4cc7c98-xr7s4\" (UID: \"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a\") " pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.255837 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-xr7s4"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.517799 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c5190c8-103c-41d8-b144-a5fc95608626-metrics-certs\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.518219 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gw94s\" (UID: \"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.523900 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c5190c8-103c-41d8-b144-a5fc95608626-metrics-certs\") pod \"frr-k8s-kk29n\" (UID: \"4c5190c8-103c-41d8-b144-a5fc95608626\") " pod="metallb-system/frr-k8s-kk29n"
Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.524056 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d1f63b9-3da1-4d8d-a803-6d8e1cec0081-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gw94s\" (UID: \"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s" Mar 12 15:02:59 crc 
kubenswrapper[4869]: I0312 15:02:59.619674 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4" Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.619776 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-metrics-certs\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4" Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.619858 4869 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 15:02:59 crc kubenswrapper[4869]: E0312 15:02:59.619935 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist podName:4162c7d8-9940-4754-884e-7a8ed66d3281 nodeName:}" failed. No retries permitted until 2026-03-12 15:03:00.619917904 +0000 UTC m=+932.905143182 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist") pod "speaker-jkzb4" (UID: "4162c7d8-9940-4754-884e-7a8ed66d3281") : secret "metallb-memberlist" not found Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.622523 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-metrics-certs\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4" Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.634065 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-xr7s4"] Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.677041 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kk29n" Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.697689 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s" Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.857054 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xr7s4" event={"ID":"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a","Type":"ContainerStarted","Data":"64499ae042a9b197bd77e821ef37436920135a9ed5ea665ac5c4e5acec9bad36"} Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.857094 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xr7s4" event={"ID":"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a","Type":"ContainerStarted","Data":"e00063d79d2b0ea1a2b8fa45895ce9f1efd3e1a611529e4b4eb6b26f574f8611"} Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.858022 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerStarted","Data":"7b7bbe8feaa39c0dffa561c27e082d9d50039ad1262f78c832d6072e01ad5051"} Mar 12 15:02:59 crc kubenswrapper[4869]: I0312 15:02:59.928497 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s"] Mar 12 15:02:59 crc kubenswrapper[4869]: W0312 15:02:59.937362 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d1f63b9_3da1_4d8d_a803_6d8e1cec0081.slice/crio-039f701733ce9ddcca5bf257ea3a3474467aab77b5fed1f99fd42cdd3cb1fe1d WatchSource:0}: Error finding container 039f701733ce9ddcca5bf257ea3a3474467aab77b5fed1f99fd42cdd3cb1fe1d: Status 404 returned error can't find the container with id 039f701733ce9ddcca5bf257ea3a3474467aab77b5fed1f99fd42cdd3cb1fe1d Mar 12 15:03:00 crc kubenswrapper[4869]: I0312 15:03:00.637288 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist\") pod 
\"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4" Mar 12 15:03:00 crc kubenswrapper[4869]: I0312 15:03:00.650419 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4162c7d8-9940-4754-884e-7a8ed66d3281-memberlist\") pod \"speaker-jkzb4\" (UID: \"4162c7d8-9940-4754-884e-7a8ed66d3281\") " pod="metallb-system/speaker-jkzb4" Mar 12 15:03:00 crc kubenswrapper[4869]: I0312 15:03:00.713236 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jkzb4" Mar 12 15:03:00 crc kubenswrapper[4869]: W0312 15:03:00.739500 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4162c7d8_9940_4754_884e_7a8ed66d3281.slice/crio-a2cdfa1b7e4c0fbd20dabaa3bbb3b293313bd37253460ad73934a6e625d03df0 WatchSource:0}: Error finding container a2cdfa1b7e4c0fbd20dabaa3bbb3b293313bd37253460ad73934a6e625d03df0: Status 404 returned error can't find the container with id a2cdfa1b7e4c0fbd20dabaa3bbb3b293313bd37253460ad73934a6e625d03df0 Mar 12 15:03:00 crc kubenswrapper[4869]: I0312 15:03:00.872338 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jkzb4" event={"ID":"4162c7d8-9940-4754-884e-7a8ed66d3281","Type":"ContainerStarted","Data":"a2cdfa1b7e4c0fbd20dabaa3bbb3b293313bd37253460ad73934a6e625d03df0"} Mar 12 15:03:00 crc kubenswrapper[4869]: I0312 15:03:00.874378 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-xr7s4" event={"ID":"0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a","Type":"ContainerStarted","Data":"dec59db101b8bb172434e7299b5b12901ee1d4e579971efb711a658a38444b81"} Mar 12 15:03:00 crc kubenswrapper[4869]: I0312 15:03:00.874483 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-xr7s4" Mar 12 15:03:00 crc 
kubenswrapper[4869]: I0312 15:03:00.876146 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s" event={"ID":"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081","Type":"ContainerStarted","Data":"039f701733ce9ddcca5bf257ea3a3474467aab77b5fed1f99fd42cdd3cb1fe1d"} Mar 12 15:03:00 crc kubenswrapper[4869]: I0312 15:03:00.914677 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-xr7s4" podStartSLOduration=2.914662637 podStartE2EDuration="2.914662637s" podCreationTimestamp="2026-03-12 15:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:03:00.912867666 +0000 UTC m=+933.198092944" watchObservedRunningTime="2026-03-12 15:03:00.914662637 +0000 UTC m=+933.199887915" Mar 12 15:03:01 crc kubenswrapper[4869]: I0312 15:03:01.885690 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jkzb4" event={"ID":"4162c7d8-9940-4754-884e-7a8ed66d3281","Type":"ContainerStarted","Data":"adcbe3bc0a652ac15c305e199e3be5bd41585b434b47a6b47eb0ff42db03962e"} Mar 12 15:03:01 crc kubenswrapper[4869]: I0312 15:03:01.885775 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jkzb4" Mar 12 15:03:01 crc kubenswrapper[4869]: I0312 15:03:01.885792 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jkzb4" event={"ID":"4162c7d8-9940-4754-884e-7a8ed66d3281","Type":"ContainerStarted","Data":"03af721d7bd28ac9699eaa8e5d251b0295eb502efbf817ae2251c2e7b8ca5de1"} Mar 12 15:03:01 crc kubenswrapper[4869]: I0312 15:03:01.908505 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jkzb4" podStartSLOduration=3.908488155 podStartE2EDuration="3.908488155s" podCreationTimestamp="2026-03-12 15:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:03:01.906096897 +0000 UTC m=+934.191322195" watchObservedRunningTime="2026-03-12 15:03:01.908488155 +0000 UTC m=+934.193713423" Mar 12 15:03:07 crc kubenswrapper[4869]: I0312 15:03:07.932187 4869 generic.go:334] "Generic (PLEG): container finished" podID="4c5190c8-103c-41d8-b144-a5fc95608626" containerID="1d8d35aacdc83d95b10b92c8ec766e4b692ee336a54c433c2e6455c25731130b" exitCode=0 Mar 12 15:03:07 crc kubenswrapper[4869]: I0312 15:03:07.932250 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerDied","Data":"1d8d35aacdc83d95b10b92c8ec766e4b692ee336a54c433c2e6455c25731130b"} Mar 12 15:03:07 crc kubenswrapper[4869]: I0312 15:03:07.934108 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s" event={"ID":"2d1f63b9-3da1-4d8d-a803-6d8e1cec0081","Type":"ContainerStarted","Data":"cd12c8cd0aed9f42a759a1692d61c404298fd8d440b611f3ab3a13c1ce49594b"} Mar 12 15:03:07 crc kubenswrapper[4869]: I0312 15:03:07.934380 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s" Mar 12 15:03:08 crc kubenswrapper[4869]: I0312 15:03:08.940443 4869 generic.go:334] "Generic (PLEG): container finished" podID="4c5190c8-103c-41d8-b144-a5fc95608626" containerID="db818e4d665341683b7d4f9e4681341a3daae919134869d05fe952da9f70e9c5" exitCode=0 Mar 12 15:03:08 crc kubenswrapper[4869]: I0312 15:03:08.940566 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerDied","Data":"db818e4d665341683b7d4f9e4681341a3daae919134869d05fe952da9f70e9c5"} Mar 12 15:03:08 crc kubenswrapper[4869]: I0312 15:03:08.968000 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s" podStartSLOduration=3.9863466709999997 podStartE2EDuration="10.967985185s" podCreationTimestamp="2026-03-12 15:02:58 +0000 UTC" firstStartedPulling="2026-03-12 15:02:59.940877473 +0000 UTC m=+932.226102751" lastFinishedPulling="2026-03-12 15:03:06.922515987 +0000 UTC m=+939.207741265" observedRunningTime="2026-03-12 15:03:07.973003385 +0000 UTC m=+940.258228683" watchObservedRunningTime="2026-03-12 15:03:08.967985185 +0000 UTC m=+941.253210463" Mar 12 15:03:09 crc kubenswrapper[4869]: I0312 15:03:09.259457 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-xr7s4" Mar 12 15:03:09 crc kubenswrapper[4869]: I0312 15:03:09.948354 4869 generic.go:334] "Generic (PLEG): container finished" podID="4c5190c8-103c-41d8-b144-a5fc95608626" containerID="c3f3a9649491f272d3d9b8cce6e3d5282581f3d82458c94ca3689202e2f180b6" exitCode=0 Mar 12 15:03:09 crc kubenswrapper[4869]: I0312 15:03:09.948453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerDied","Data":"c3f3a9649491f272d3d9b8cce6e3d5282581f3d82458c94ca3689202e2f180b6"} Mar 12 15:03:10 crc kubenswrapper[4869]: I0312 15:03:10.958918 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerStarted","Data":"c8761dbb0462d0103b0d098c3481868624944a9959d4fb857b23cc8f331c44cc"} Mar 12 15:03:10 crc kubenswrapper[4869]: I0312 15:03:10.958957 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerStarted","Data":"4f95675cdd27edd562c71ac170defe11e719bd6fb45e8c03643409af5b4d0895"} Mar 12 15:03:10 crc kubenswrapper[4869]: I0312 15:03:10.958967 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerStarted","Data":"d948c9e8ea44455dd854197b3b321f1940f7eb5ed686701d17cc268477893612"} Mar 12 15:03:10 crc kubenswrapper[4869]: I0312 15:03:10.958976 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerStarted","Data":"28fb8eb785813a46de13ad09d39d7f1892fc4be69ab11b7a8e880ffa00a1f60c"} Mar 12 15:03:10 crc kubenswrapper[4869]: I0312 15:03:10.958983 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerStarted","Data":"f845b396cdeec36dcfd35a9ae545feafe33b440595d0766dee2a6dc744edcf52"} Mar 12 15:03:10 crc kubenswrapper[4869]: I0312 15:03:10.958991 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kk29n" event={"ID":"4c5190c8-103c-41d8-b144-a5fc95608626","Type":"ContainerStarted","Data":"e460fbaaf8de1482edb430101a976172baad376f5a83e9b4df40c289a8ef9891"} Mar 12 15:03:10 crc kubenswrapper[4869]: I0312 15:03:10.959103 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kk29n" Mar 12 15:03:14 crc kubenswrapper[4869]: I0312 15:03:14.677980 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kk29n" Mar 12 15:03:14 crc kubenswrapper[4869]: I0312 15:03:14.713728 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kk29n" Mar 12 15:03:14 crc kubenswrapper[4869]: I0312 15:03:14.740363 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kk29n" podStartSLOduration=9.592224226999999 podStartE2EDuration="16.74034716s" podCreationTimestamp="2026-03-12 15:02:58 +0000 UTC" firstStartedPulling="2026-03-12 15:02:59.780211388 +0000 UTC m=+932.065436666" 
lastFinishedPulling="2026-03-12 15:03:06.928334321 +0000 UTC m=+939.213559599" observedRunningTime="2026-03-12 15:03:10.980417623 +0000 UTC m=+943.265642911" watchObservedRunningTime="2026-03-12 15:03:14.74034716 +0000 UTC m=+947.025572428" Mar 12 15:03:19 crc kubenswrapper[4869]: I0312 15:03:19.701169 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gw94s" Mar 12 15:03:20 crc kubenswrapper[4869]: I0312 15:03:20.717035 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jkzb4" Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.680817 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-x5f8z"] Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.681976 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x5f8z" Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.683853 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.684183 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-56ccj" Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.684203 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.701150 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x5f8z"] Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.729821 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7d4z\" (UniqueName: \"kubernetes.io/projected/7440ff1b-ed7b-4da0-b191-e7ee62917f2d-kube-api-access-k7d4z\") pod 
\"openstack-operator-index-x5f8z\" (UID: \"7440ff1b-ed7b-4da0-b191-e7ee62917f2d\") " pod="openstack-operators/openstack-operator-index-x5f8z" Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.830872 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7d4z\" (UniqueName: \"kubernetes.io/projected/7440ff1b-ed7b-4da0-b191-e7ee62917f2d-kube-api-access-k7d4z\") pod \"openstack-operator-index-x5f8z\" (UID: \"7440ff1b-ed7b-4da0-b191-e7ee62917f2d\") " pod="openstack-operators/openstack-operator-index-x5f8z" Mar 12 15:03:23 crc kubenswrapper[4869]: I0312 15:03:23.847898 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7d4z\" (UniqueName: \"kubernetes.io/projected/7440ff1b-ed7b-4da0-b191-e7ee62917f2d-kube-api-access-k7d4z\") pod \"openstack-operator-index-x5f8z\" (UID: \"7440ff1b-ed7b-4da0-b191-e7ee62917f2d\") " pod="openstack-operators/openstack-operator-index-x5f8z" Mar 12 15:03:24 crc kubenswrapper[4869]: I0312 15:03:24.005826 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-x5f8z" Mar 12 15:03:24 crc kubenswrapper[4869]: I0312 15:03:24.380841 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x5f8z"] Mar 12 15:03:24 crc kubenswrapper[4869]: W0312 15:03:24.388484 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7440ff1b_ed7b_4da0_b191_e7ee62917f2d.slice/crio-e7d3ae1a13d0a6b2f55bc3d3b270fbd591e65c94b06ea71ec548eecc2debe2b6 WatchSource:0}: Error finding container e7d3ae1a13d0a6b2f55bc3d3b270fbd591e65c94b06ea71ec548eecc2debe2b6: Status 404 returned error can't find the container with id e7d3ae1a13d0a6b2f55bc3d3b270fbd591e65c94b06ea71ec548eecc2debe2b6 Mar 12 15:03:25 crc kubenswrapper[4869]: I0312 15:03:25.059184 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x5f8z" event={"ID":"7440ff1b-ed7b-4da0-b191-e7ee62917f2d","Type":"ContainerStarted","Data":"e7d3ae1a13d0a6b2f55bc3d3b270fbd591e65c94b06ea71ec548eecc2debe2b6"} Mar 12 15:03:27 crc kubenswrapper[4869]: I0312 15:03:27.073660 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x5f8z" event={"ID":"7440ff1b-ed7b-4da0-b191-e7ee62917f2d","Type":"ContainerStarted","Data":"e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c"} Mar 12 15:03:27 crc kubenswrapper[4869]: I0312 15:03:27.096308 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-x5f8z" podStartSLOduration=1.628142215 podStartE2EDuration="4.096290573s" podCreationTimestamp="2026-03-12 15:03:23 +0000 UTC" firstStartedPulling="2026-03-12 15:03:24.390804476 +0000 UTC m=+956.676029754" lastFinishedPulling="2026-03-12 15:03:26.858952834 +0000 UTC m=+959.144178112" observedRunningTime="2026-03-12 15:03:27.093585576 +0000 UTC 
m=+959.378810864" watchObservedRunningTime="2026-03-12 15:03:27.096290573 +0000 UTC m=+959.381515851" Mar 12 15:03:27 crc kubenswrapper[4869]: I0312 15:03:27.269742 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-x5f8z"] Mar 12 15:03:27 crc kubenswrapper[4869]: I0312 15:03:27.871425 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-89bmr"] Mar 12 15:03:27 crc kubenswrapper[4869]: I0312 15:03:27.872953 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-89bmr" Mar 12 15:03:27 crc kubenswrapper[4869]: I0312 15:03:27.891783 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-89bmr"] Mar 12 15:03:28 crc kubenswrapper[4869]: I0312 15:03:28.009663 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksbcd\" (UniqueName: \"kubernetes.io/projected/7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23-kube-api-access-ksbcd\") pod \"openstack-operator-index-89bmr\" (UID: \"7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23\") " pod="openstack-operators/openstack-operator-index-89bmr" Mar 12 15:03:28 crc kubenswrapper[4869]: I0312 15:03:28.111438 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksbcd\" (UniqueName: \"kubernetes.io/projected/7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23-kube-api-access-ksbcd\") pod \"openstack-operator-index-89bmr\" (UID: \"7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23\") " pod="openstack-operators/openstack-operator-index-89bmr" Mar 12 15:03:28 crc kubenswrapper[4869]: I0312 15:03:28.141019 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksbcd\" (UniqueName: \"kubernetes.io/projected/7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23-kube-api-access-ksbcd\") pod \"openstack-operator-index-89bmr\" (UID: 
\"7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23\") " pod="openstack-operators/openstack-operator-index-89bmr" Mar 12 15:03:28 crc kubenswrapper[4869]: I0312 15:03:28.204591 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-89bmr" Mar 12 15:03:28 crc kubenswrapper[4869]: I0312 15:03:28.472047 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-89bmr"] Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.089302 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-89bmr" event={"ID":"7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23","Type":"ContainerStarted","Data":"56772214abb08515537b707ceddf64ae53666098656dcd82eec5f9cc53b3e051"} Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.089725 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-89bmr" event={"ID":"7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23","Type":"ContainerStarted","Data":"3c750ce305f69f4932e4189ea6f8344e6f04b8ee7e6f56e02fb8e763f54877a1"} Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.089402 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-x5f8z" podUID="7440ff1b-ed7b-4da0-b191-e7ee62917f2d" containerName="registry-server" containerID="cri-o://e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c" gracePeriod=2 Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.108837 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-89bmr" podStartSLOduration=2.064042889 podStartE2EDuration="2.108808853s" podCreationTimestamp="2026-03-12 15:03:27 +0000 UTC" firstStartedPulling="2026-03-12 15:03:28.482599739 +0000 UTC m=+960.767825017" lastFinishedPulling="2026-03-12 15:03:28.527365703 +0000 UTC m=+960.812590981" 
observedRunningTime="2026-03-12 15:03:29.105258043 +0000 UTC m=+961.390483341" watchObservedRunningTime="2026-03-12 15:03:29.108808853 +0000 UTC m=+961.394034171" Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.541257 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x5f8z" Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.682751 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kk29n" Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.735977 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7d4z\" (UniqueName: \"kubernetes.io/projected/7440ff1b-ed7b-4da0-b191-e7ee62917f2d-kube-api-access-k7d4z\") pod \"7440ff1b-ed7b-4da0-b191-e7ee62917f2d\" (UID: \"7440ff1b-ed7b-4da0-b191-e7ee62917f2d\") " Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.745763 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7440ff1b-ed7b-4da0-b191-e7ee62917f2d-kube-api-access-k7d4z" (OuterVolumeSpecName: "kube-api-access-k7d4z") pod "7440ff1b-ed7b-4da0-b191-e7ee62917f2d" (UID: "7440ff1b-ed7b-4da0-b191-e7ee62917f2d"). InnerVolumeSpecName "kube-api-access-k7d4z". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:03:29 crc kubenswrapper[4869]: I0312 15:03:29.837324 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7d4z\" (UniqueName: \"kubernetes.io/projected/7440ff1b-ed7b-4da0-b191-e7ee62917f2d-kube-api-access-k7d4z\") on node \"crc\" DevicePath \"\""
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.100524 4869 generic.go:334] "Generic (PLEG): container finished" podID="7440ff1b-ed7b-4da0-b191-e7ee62917f2d" containerID="e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c" exitCode=0
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.100595 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x5f8z"
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.100619 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x5f8z" event={"ID":"7440ff1b-ed7b-4da0-b191-e7ee62917f2d","Type":"ContainerDied","Data":"e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c"}
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.100689 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x5f8z" event={"ID":"7440ff1b-ed7b-4da0-b191-e7ee62917f2d","Type":"ContainerDied","Data":"e7d3ae1a13d0a6b2f55bc3d3b270fbd591e65c94b06ea71ec548eecc2debe2b6"}
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.100731 4869 scope.go:117] "RemoveContainer" containerID="e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c"
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.119288 4869 scope.go:117] "RemoveContainer" containerID="e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c"
Mar 12 15:03:30 crc kubenswrapper[4869]: E0312 15:03:30.119957 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c\": container with ID starting with e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c not found: ID does not exist" containerID="e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c"
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.119992 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c"} err="failed to get container status \"e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c\": rpc error: code = NotFound desc = could not find container \"e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c\": container with ID starting with e35047cf16f0e7af0371e6db7afa52b3eec014d8463ea31168f136a086898e4c not found: ID does not exist"
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.129671 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-x5f8z"]
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.135037 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-x5f8z"]
Mar 12 15:03:30 crc kubenswrapper[4869]: I0312 15:03:30.347778 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7440ff1b-ed7b-4da0-b191-e7ee62917f2d" path="/var/lib/kubelet/pods/7440ff1b-ed7b-4da0-b191-e7ee62917f2d/volumes"
Mar 12 15:03:38 crc kubenswrapper[4869]: I0312 15:03:38.205422 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-89bmr"
Mar 12 15:03:38 crc kubenswrapper[4869]: I0312 15:03:38.206127 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-89bmr"
Mar 12 15:03:38 crc kubenswrapper[4869]: I0312 15:03:38.243988 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-89bmr"
Mar 12 15:03:39 crc kubenswrapper[4869]: I0312 15:03:39.196908 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-89bmr"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.871291 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"]
Mar 12 15:03:46 crc kubenswrapper[4869]: E0312 15:03:46.872391 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7440ff1b-ed7b-4da0-b191-e7ee62917f2d" containerName="registry-server"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.872413 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7440ff1b-ed7b-4da0-b191-e7ee62917f2d" containerName="registry-server"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.872684 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7440ff1b-ed7b-4da0-b191-e7ee62917f2d" containerName="registry-server"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.874381 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.876902 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4pqt4"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.882022 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"]
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.899619 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-bundle\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.899724 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-util\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:46 crc kubenswrapper[4869]: I0312 15:03:46.899802 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlthr\" (UniqueName: \"kubernetes.io/projected/7dadcf56-7151-4b50-953e-f469f19ac9be-kube-api-access-jlthr\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.001285 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlthr\" (UniqueName: \"kubernetes.io/projected/7dadcf56-7151-4b50-953e-f469f19ac9be-kube-api-access-jlthr\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.001430 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-bundle\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.001505 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-util\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.002308 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-util\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.002459 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-bundle\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.036512 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlthr\" (UniqueName: \"kubernetes.io/projected/7dadcf56-7151-4b50-953e-f469f19ac9be-kube-api-access-jlthr\") pod \"dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") " pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.210634 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:47 crc kubenswrapper[4869]: I0312 15:03:47.478786 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"]
Mar 12 15:03:47 crc kubenswrapper[4869]: W0312 15:03:47.481576 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dadcf56_7151_4b50_953e_f469f19ac9be.slice/crio-53b6dc86a0b355b91ef58900d761e8d24961a8bf9ee47bba79cff67b7b95acb0 WatchSource:0}: Error finding container 53b6dc86a0b355b91ef58900d761e8d24961a8bf9ee47bba79cff67b7b95acb0: Status 404 returned error can't find the container with id 53b6dc86a0b355b91ef58900d761e8d24961a8bf9ee47bba79cff67b7b95acb0
Mar 12 15:03:48 crc kubenswrapper[4869]: I0312 15:03:48.225660 4869 generic.go:334] "Generic (PLEG): container finished" podID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerID="e2a1e90bb07f23de8976cdcb45f57840b3600b39108e8587b694c6f7f1393f11" exitCode=0
Mar 12 15:03:48 crc kubenswrapper[4869]: I0312 15:03:48.225751 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57" event={"ID":"7dadcf56-7151-4b50-953e-f469f19ac9be","Type":"ContainerDied","Data":"e2a1e90bb07f23de8976cdcb45f57840b3600b39108e8587b694c6f7f1393f11"}
Mar 12 15:03:48 crc kubenswrapper[4869]: I0312 15:03:48.225837 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57" event={"ID":"7dadcf56-7151-4b50-953e-f469f19ac9be","Type":"ContainerStarted","Data":"53b6dc86a0b355b91ef58900d761e8d24961a8bf9ee47bba79cff67b7b95acb0"}
Mar 12 15:03:49 crc kubenswrapper[4869]: I0312 15:03:49.235212 4869 generic.go:334] "Generic (PLEG): container finished" podID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerID="68b809b1b2f64ca321b54da10d904b3302f17b8cf72bd31d17a4a32c6cfbe300" exitCode=0
Mar 12 15:03:49 crc kubenswrapper[4869]: I0312 15:03:49.235280 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57" event={"ID":"7dadcf56-7151-4b50-953e-f469f19ac9be","Type":"ContainerDied","Data":"68b809b1b2f64ca321b54da10d904b3302f17b8cf72bd31d17a4a32c6cfbe300"}
Mar 12 15:03:50 crc kubenswrapper[4869]: I0312 15:03:50.246523 4869 generic.go:334] "Generic (PLEG): container finished" podID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerID="cd6ee2c6cf111f000cfea13fcfccb2f192dbc8ef1c0995150193832c91ac5b63" exitCode=0
Mar 12 15:03:50 crc kubenswrapper[4869]: I0312 15:03:50.246584 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57" event={"ID":"7dadcf56-7151-4b50-953e-f469f19ac9be","Type":"ContainerDied","Data":"cd6ee2c6cf111f000cfea13fcfccb2f192dbc8ef1c0995150193832c91ac5b63"}
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.475407 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.563666 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-util\") pod \"7dadcf56-7151-4b50-953e-f469f19ac9be\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") "
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.563917 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlthr\" (UniqueName: \"kubernetes.io/projected/7dadcf56-7151-4b50-953e-f469f19ac9be-kube-api-access-jlthr\") pod \"7dadcf56-7151-4b50-953e-f469f19ac9be\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") "
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.563998 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-bundle\") pod \"7dadcf56-7151-4b50-953e-f469f19ac9be\" (UID: \"7dadcf56-7151-4b50-953e-f469f19ac9be\") "
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.564655 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-bundle" (OuterVolumeSpecName: "bundle") pod "7dadcf56-7151-4b50-953e-f469f19ac9be" (UID: "7dadcf56-7151-4b50-953e-f469f19ac9be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.568273 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dadcf56-7151-4b50-953e-f469f19ac9be-kube-api-access-jlthr" (OuterVolumeSpecName: "kube-api-access-jlthr") pod "7dadcf56-7151-4b50-953e-f469f19ac9be" (UID: "7dadcf56-7151-4b50-953e-f469f19ac9be"). InnerVolumeSpecName "kube-api-access-jlthr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.584925 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-util" (OuterVolumeSpecName: "util") pod "7dadcf56-7151-4b50-953e-f469f19ac9be" (UID: "7dadcf56-7151-4b50-953e-f469f19ac9be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.665185 4869 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.665216 4869 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dadcf56-7151-4b50-953e-f469f19ac9be-util\") on node \"crc\" DevicePath \"\""
Mar 12 15:03:51 crc kubenswrapper[4869]: I0312 15:03:51.665225 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlthr\" (UniqueName: \"kubernetes.io/projected/7dadcf56-7151-4b50-953e-f469f19ac9be-kube-api-access-jlthr\") on node \"crc\" DevicePath \"\""
Mar 12 15:03:52 crc kubenswrapper[4869]: I0312 15:03:52.260220 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57" event={"ID":"7dadcf56-7151-4b50-953e-f469f19ac9be","Type":"ContainerDied","Data":"53b6dc86a0b355b91ef58900d761e8d24961a8bf9ee47bba79cff67b7b95acb0"}
Mar 12 15:03:52 crc kubenswrapper[4869]: I0312 15:03:52.260267 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53b6dc86a0b355b91ef58900d761e8d24961a8bf9ee47bba79cff67b7b95acb0"
Mar 12 15:03:52 crc kubenswrapper[4869]: I0312 15:03:52.260348 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.537429 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"]
Mar 12 15:03:59 crc kubenswrapper[4869]: E0312 15:03:59.538295 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerName="extract"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.538310 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerName="extract"
Mar 12 15:03:59 crc kubenswrapper[4869]: E0312 15:03:59.538327 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerName="util"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.538336 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerName="util"
Mar 12 15:03:59 crc kubenswrapper[4869]: E0312 15:03:59.538359 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerName="pull"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.538368 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerName="pull"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.538525 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dadcf56-7151-4b50-953e-f469f19ac9be" containerName="extract"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.539088 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.543930 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-76d2c"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.561570 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"]
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.674086 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7x7s\" (UniqueName: \"kubernetes.io/projected/fe21499b-24de-4455-86e6-41ca9441e3d4-kube-api-access-m7x7s\") pod \"openstack-operator-controller-init-666b5bf768-cfwmm\" (UID: \"fe21499b-24de-4455-86e6-41ca9441e3d4\") " pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.775382 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7x7s\" (UniqueName: \"kubernetes.io/projected/fe21499b-24de-4455-86e6-41ca9441e3d4-kube-api-access-m7x7s\") pod \"openstack-operator-controller-init-666b5bf768-cfwmm\" (UID: \"fe21499b-24de-4455-86e6-41ca9441e3d4\") " pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.793335 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7x7s\" (UniqueName: \"kubernetes.io/projected/fe21499b-24de-4455-86e6-41ca9441e3d4-kube-api-access-m7x7s\") pod \"openstack-operator-controller-init-666b5bf768-cfwmm\" (UID: \"fe21499b-24de-4455-86e6-41ca9441e3d4\") " pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"
Mar 12 15:03:59 crc kubenswrapper[4869]: I0312 15:03:59.858724 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.132888 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555464-nrhv5"]
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.134261 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-nrhv5"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.135724 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.136011 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.136907 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.143262 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-nrhv5"]
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.191246 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57w4\" (UniqueName: \"kubernetes.io/projected/6ec8e133-2a4d-4494-9372-eeb75efc75b5-kube-api-access-x57w4\") pod \"auto-csr-approver-29555464-nrhv5\" (UID: \"6ec8e133-2a4d-4494-9372-eeb75efc75b5\") " pod="openshift-infra/auto-csr-approver-29555464-nrhv5"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.292632 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57w4\" (UniqueName: \"kubernetes.io/projected/6ec8e133-2a4d-4494-9372-eeb75efc75b5-kube-api-access-x57w4\") pod \"auto-csr-approver-29555464-nrhv5\" (UID: \"6ec8e133-2a4d-4494-9372-eeb75efc75b5\") " pod="openshift-infra/auto-csr-approver-29555464-nrhv5"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.299656 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"]
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.317823 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm" event={"ID":"fe21499b-24de-4455-86e6-41ca9441e3d4","Type":"ContainerStarted","Data":"821c5d4022e8f8cfc376c6a6bc68cc71366870df722f27055dc80573f1567eb6"}
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.318554 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57w4\" (UniqueName: \"kubernetes.io/projected/6ec8e133-2a4d-4494-9372-eeb75efc75b5-kube-api-access-x57w4\") pod \"auto-csr-approver-29555464-nrhv5\" (UID: \"6ec8e133-2a4d-4494-9372-eeb75efc75b5\") " pod="openshift-infra/auto-csr-approver-29555464-nrhv5"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.451423 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-nrhv5"
Mar 12 15:04:00 crc kubenswrapper[4869]: I0312 15:04:00.647111 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-nrhv5"]
Mar 12 15:04:00 crc kubenswrapper[4869]: W0312 15:04:00.655864 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ec8e133_2a4d_4494_9372_eeb75efc75b5.slice/crio-a65ec41137ed224e63632619047511973a46210bfa1bcc97e3ccd3d764cec629 WatchSource:0}: Error finding container a65ec41137ed224e63632619047511973a46210bfa1bcc97e3ccd3d764cec629: Status 404 returned error can't find the container with id a65ec41137ed224e63632619047511973a46210bfa1bcc97e3ccd3d764cec629
Mar 12 15:04:01 crc kubenswrapper[4869]: I0312 15:04:01.325940 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555464-nrhv5" event={"ID":"6ec8e133-2a4d-4494-9372-eeb75efc75b5","Type":"ContainerStarted","Data":"a65ec41137ed224e63632619047511973a46210bfa1bcc97e3ccd3d764cec629"}
Mar 12 15:04:04 crc kubenswrapper[4869]: I0312 15:04:04.357705 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm" event={"ID":"fe21499b-24de-4455-86e6-41ca9441e3d4","Type":"ContainerStarted","Data":"e53e33cb83a3335d2b229aa7537fcc7e12cc4f8f1be5197f66cf932a1164fb68"}
Mar 12 15:04:04 crc kubenswrapper[4869]: I0312 15:04:04.358962 4869 generic.go:334] "Generic (PLEG): container finished" podID="6ec8e133-2a4d-4494-9372-eeb75efc75b5" containerID="134a0a881c718f6804c48d1c69519c46cdcdfff2b928bbdb5070d0c06ca68204" exitCode=0
Mar 12 15:04:04 crc kubenswrapper[4869]: I0312 15:04:04.358991 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555464-nrhv5" event={"ID":"6ec8e133-2a4d-4494-9372-eeb75efc75b5","Type":"ContainerDied","Data":"134a0a881c718f6804c48d1c69519c46cdcdfff2b928bbdb5070d0c06ca68204"}
Mar 12 15:04:04 crc kubenswrapper[4869]: I0312 15:04:04.402666 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm" podStartSLOduration=1.80519571 podStartE2EDuration="5.402651314s" podCreationTimestamp="2026-03-12 15:03:59 +0000 UTC" firstStartedPulling="2026-03-12 15:04:00.311139557 +0000 UTC m=+992.596364835" lastFinishedPulling="2026-03-12 15:04:03.908595141 +0000 UTC m=+996.193820439" observedRunningTime="2026-03-12 15:04:04.389163541 +0000 UTC m=+996.674388819" watchObservedRunningTime="2026-03-12 15:04:04.402651314 +0000 UTC m=+996.687876592"
Mar 12 15:04:05 crc kubenswrapper[4869]: I0312 15:04:05.368092 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"
Mar 12 15:04:05 crc kubenswrapper[4869]: I0312 15:04:05.658641 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-nrhv5"
Mar 12 15:04:05 crc kubenswrapper[4869]: I0312 15:04:05.675814 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x57w4\" (UniqueName: \"kubernetes.io/projected/6ec8e133-2a4d-4494-9372-eeb75efc75b5-kube-api-access-x57w4\") pod \"6ec8e133-2a4d-4494-9372-eeb75efc75b5\" (UID: \"6ec8e133-2a4d-4494-9372-eeb75efc75b5\") "
Mar 12 15:04:05 crc kubenswrapper[4869]: I0312 15:04:05.684780 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec8e133-2a4d-4494-9372-eeb75efc75b5-kube-api-access-x57w4" (OuterVolumeSpecName: "kube-api-access-x57w4") pod "6ec8e133-2a4d-4494-9372-eeb75efc75b5" (UID: "6ec8e133-2a4d-4494-9372-eeb75efc75b5"). InnerVolumeSpecName "kube-api-access-x57w4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:04:05 crc kubenswrapper[4869]: I0312 15:04:05.777302 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x57w4\" (UniqueName: \"kubernetes.io/projected/6ec8e133-2a4d-4494-9372-eeb75efc75b5-kube-api-access-x57w4\") on node \"crc\" DevicePath \"\""
Mar 12 15:04:06 crc kubenswrapper[4869]: I0312 15:04:06.375243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555464-nrhv5" event={"ID":"6ec8e133-2a4d-4494-9372-eeb75efc75b5","Type":"ContainerDied","Data":"a65ec41137ed224e63632619047511973a46210bfa1bcc97e3ccd3d764cec629"}
Mar 12 15:04:06 crc kubenswrapper[4869]: I0312 15:04:06.375317 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a65ec41137ed224e63632619047511973a46210bfa1bcc97e3ccd3d764cec629"
Mar 12 15:04:06 crc kubenswrapper[4869]: I0312 15:04:06.375272 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-nrhv5"
Mar 12 15:04:06 crc kubenswrapper[4869]: I0312 15:04:06.721670 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-b2j8t"]
Mar 12 15:04:06 crc kubenswrapper[4869]: I0312 15:04:06.726885 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-b2j8t"]
Mar 12 15:04:08 crc kubenswrapper[4869]: I0312 15:04:08.346296 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78a2bb0-61ee-488f-986d-4e1f62a7ff0c" path="/var/lib/kubelet/pods/f78a2bb0-61ee-488f-986d-4e1f62a7ff0c/volumes"
Mar 12 15:04:09 crc kubenswrapper[4869]: I0312 15:04:09.862054 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-666b5bf768-cfwmm"
Mar 12 15:04:29 crc kubenswrapper[4869]: I0312 15:04:29.327271 4869 scope.go:117] "RemoveContainer" containerID="601bd64c4b111a60c475cc2d26c60302abd2916331fd8f1ab3780f494751fdeb"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.830533 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx"]
Mar 12 15:04:46 crc kubenswrapper[4869]: E0312 15:04:46.831256 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec8e133-2a4d-4494-9372-eeb75efc75b5" containerName="oc"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.831272 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec8e133-2a4d-4494-9372-eeb75efc75b5" containerName="oc"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.831456 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec8e133-2a4d-4494-9372-eeb75efc75b5" containerName="oc"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.832196 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.834261 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zf5z9"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.840207 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.841337 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.843874 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-blbgz"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.850128 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.855452 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.862520 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.863439 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.865427 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-86vpb"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.885263 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.886196 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.890730 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-w6g4l"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.899598 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.909821 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.922328 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.923151 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.928230 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xz858"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.941686 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.942596 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.946133 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.948659 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bd6mw"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.968617 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.973666 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.974430 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.980873 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vxm4j"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.980961 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.981049 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.981999 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.984392 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pbzjj"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.991598 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv"]
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.995600 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgnvt\" (UniqueName: \"kubernetes.io/projected/8d41de12-9bea-4bbc-a276-296376e563a8-kube-api-access-dgnvt\") pod \"glance-operator-controller-manager-5964f64c48-bbfwc\" (UID: \"8d41de12-9bea-4bbc-a276-296376e563a8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.995666 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88ssx\" (UniqueName: \"kubernetes.io/projected/62ce9e5f-8d12-425b-b966-aca955bd96d9-kube-api-access-88ssx\") pod \"barbican-operator-controller-manager-677bd678f7-g9sqx\" (UID: \"62ce9e5f-8d12-425b-b966-aca955bd96d9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.995692 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwpd\" (UniqueName: \"kubernetes.io/projected/acaa3149-c349-4d8e-95ba-56d1714eb3b6-kube-api-access-tkwpd\") pod \"cinder-operator-controller-manager-984cd4dcf-hn64k\" (UID: \"acaa3149-c349-4d8e-95ba-56d1714eb3b6\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k"
Mar 12 15:04:46 crc kubenswrapper[4869]: I0312 15:04:46.995731 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29p5f\" (UniqueName: \"kubernetes.io/projected/d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26-kube-api-access-29p5f\") pod \"designate-operator-controller-manager-66d56f6ff4-2phvg\" (UID: \"d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg"
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.013360 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp"]
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.028655 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp"]
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.029623 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp"
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.040356 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-f7cqv"
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.044181 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp"]
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.056586 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn"]
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.057345 4869 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.064097 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5bgzt" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.067828 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.071058 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.071887 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.075437 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-h7r9w" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.083863 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.084731 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.087957 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sjz5w" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.090473 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-7p294"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.091323 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.094849 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ckjlp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097123 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9ftz\" (UniqueName: \"kubernetes.io/projected/c9089d1a-4d28-4973-a826-c7fa8b99acab-kube-api-access-x9ftz\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097183 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47pg8\" (UniqueName: \"kubernetes.io/projected/bdcee336-08a4-4504-8eb3-b09b4899d2ed-kube-api-access-47pg8\") pod \"ironic-operator-controller-manager-6bbb499bbc-cxtpp\" (UID: \"bdcee336-08a4-4504-8eb3-b09b4899d2ed\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097222 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dgnvt\" (UniqueName: \"kubernetes.io/projected/8d41de12-9bea-4bbc-a276-296376e563a8-kube-api-access-dgnvt\") pod \"glance-operator-controller-manager-5964f64c48-bbfwc\" (UID: \"8d41de12-9bea-4bbc-a276-296376e563a8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097243 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097267 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ssx\" (UniqueName: \"kubernetes.io/projected/62ce9e5f-8d12-425b-b966-aca955bd96d9-kube-api-access-88ssx\") pod \"barbican-operator-controller-manager-677bd678f7-g9sqx\" (UID: \"62ce9e5f-8d12-425b-b966-aca955bd96d9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097283 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxb7\" (UniqueName: \"kubernetes.io/projected/454883b6-ef08-4828-acc8-237632cf4a35-kube-api-access-5jxb7\") pod \"heat-operator-controller-manager-77b6666d85-qhsnc\" (UID: \"454883b6-ef08-4828-acc8-237632cf4a35\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097303 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwpd\" (UniqueName: \"kubernetes.io/projected/acaa3149-c349-4d8e-95ba-56d1714eb3b6-kube-api-access-tkwpd\") pod 
\"cinder-operator-controller-manager-984cd4dcf-hn64k\" (UID: \"acaa3149-c349-4d8e-95ba-56d1714eb3b6\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097319 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29p5f\" (UniqueName: \"kubernetes.io/projected/d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26-kube-api-access-29p5f\") pod \"designate-operator-controller-manager-66d56f6ff4-2phvg\" (UID: \"d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.097342 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgmcc\" (UniqueName: \"kubernetes.io/projected/9c4c52f4-d899-4059-8d91-29e4dd1971fd-kube-api-access-bgmcc\") pod \"horizon-operator-controller-manager-6d9d6b584d-hxljg\" (UID: \"9c4c52f4-d899-4059-8d91-29e4dd1971fd\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.102438 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.131997 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ssx\" (UniqueName: \"kubernetes.io/projected/62ce9e5f-8d12-425b-b966-aca955bd96d9-kube-api-access-88ssx\") pod \"barbican-operator-controller-manager-677bd678f7-g9sqx\" (UID: \"62ce9e5f-8d12-425b-b966-aca955bd96d9\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.133195 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-7p294"] Mar 12 15:04:47 crc 
kubenswrapper[4869]: I0312 15:04:47.137261 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwpd\" (UniqueName: \"kubernetes.io/projected/acaa3149-c349-4d8e-95ba-56d1714eb3b6-kube-api-access-tkwpd\") pod \"cinder-operator-controller-manager-984cd4dcf-hn64k\" (UID: \"acaa3149-c349-4d8e-95ba-56d1714eb3b6\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.144404 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.147701 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29p5f\" (UniqueName: \"kubernetes.io/projected/d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26-kube-api-access-29p5f\") pod \"designate-operator-controller-manager-66d56f6ff4-2phvg\" (UID: \"d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.149452 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.150081 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgnvt\" (UniqueName: \"kubernetes.io/projected/8d41de12-9bea-4bbc-a276-296376e563a8-kube-api-access-dgnvt\") pod \"glance-operator-controller-manager-5964f64c48-bbfwc\" (UID: \"8d41de12-9bea-4bbc-a276-296376e563a8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.150214 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.155208 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8z5qw" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.158055 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.179453 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.190835 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.201821 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.202163 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7ktb\" (UniqueName: \"kubernetes.io/projected/2195e51b-747e-4e94-b616-8fe940ffe5ed-kube-api-access-h7ktb\") pod \"mariadb-operator-controller-manager-658d4cdd5-zsvnq\" (UID: \"2195e51b-747e-4e94-b616-8fe940ffe5ed\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.202188 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2464n\" (UniqueName: \"kubernetes.io/projected/fe9827df-43c1-4491-923e-0c745e025aec-kube-api-access-2464n\") pod \"neutron-operator-controller-manager-776c5696bf-h9gxw\" (UID: \"fe9827df-43c1-4491-923e-0c745e025aec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.202237 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g52z9\" (UniqueName: \"kubernetes.io/projected/50bd86b6-6404-44de-822c-b75e6692a36b-kube-api-access-g52z9\") pod \"manila-operator-controller-manager-68f45f9d9f-sggnn\" (UID: \"50bd86b6-6404-44de-822c-b75e6692a36b\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.202265 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxb7\" (UniqueName: \"kubernetes.io/projected/454883b6-ef08-4828-acc8-237632cf4a35-kube-api-access-5jxb7\") pod \"heat-operator-controller-manager-77b6666d85-qhsnc\" (UID: \"454883b6-ef08-4828-acc8-237632cf4a35\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.202307 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmhj\" (UniqueName: \"kubernetes.io/projected/9c8724e6-72a2-441e-bb6a-330ee1ccfe6f-kube-api-access-6kmhj\") pod \"keystone-operator-controller-manager-684f77d66d-bgssp\" (UID: \"9c8724e6-72a2-441e-bb6a-330ee1ccfe6f\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.202563 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 
15:04:47.202659 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert podName:c9089d1a-4d28-4973-a826-c7fa8b99acab nodeName:}" failed. No retries permitted until 2026-03-12 15:04:47.702623806 +0000 UTC m=+1039.987849084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert") pod "infra-operator-controller-manager-5995f4446f-vshpv" (UID: "c9089d1a-4d28-4973-a826-c7fa8b99acab") : secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.202859 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmcc\" (UniqueName: \"kubernetes.io/projected/9c4c52f4-d899-4059-8d91-29e4dd1971fd-kube-api-access-bgmcc\") pod \"horizon-operator-controller-manager-6d9d6b584d-hxljg\" (UID: \"9c4c52f4-d899-4059-8d91-29e4dd1971fd\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.203143 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9ftz\" (UniqueName: \"kubernetes.io/projected/c9089d1a-4d28-4973-a826-c7fa8b99acab-kube-api-access-x9ftz\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.203198 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j2bc\" (UniqueName: \"kubernetes.io/projected/cbf420e9-2c19-4582-b36b-0d4651f6d067-kube-api-access-2j2bc\") pod \"nova-operator-controller-manager-569cc54c5-7p294\" (UID: \"cbf420e9-2c19-4582-b36b-0d4651f6d067\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" 
Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.203290 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47pg8\" (UniqueName: \"kubernetes.io/projected/bdcee336-08a4-4504-8eb3-b09b4899d2ed-kube-api-access-47pg8\") pod \"ironic-operator-controller-manager-6bbb499bbc-cxtpp\" (UID: \"bdcee336-08a4-4504-8eb3-b09b4899d2ed\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.203732 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.217755 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.247957 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47pg8\" (UniqueName: \"kubernetes.io/projected/bdcee336-08a4-4504-8eb3-b09b4899d2ed-kube-api-access-47pg8\") pod \"ironic-operator-controller-manager-6bbb499bbc-cxtpp\" (UID: \"bdcee336-08a4-4504-8eb3-b09b4899d2ed\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.255861 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.259330 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9ftz\" (UniqueName: \"kubernetes.io/projected/c9089d1a-4d28-4973-a826-c7fa8b99acab-kube-api-access-x9ftz\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.264132 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.265745 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zns9g" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.266139 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.266982 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmcc\" (UniqueName: \"kubernetes.io/projected/9c4c52f4-d899-4059-8d91-29e4dd1971fd-kube-api-access-bgmcc\") pod \"horizon-operator-controller-manager-6d9d6b584d-hxljg\" (UID: \"9c4c52f4-d899-4059-8d91-29e4dd1971fd\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.267305 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxb7\" (UniqueName: \"kubernetes.io/projected/454883b6-ef08-4828-acc8-237632cf4a35-kube-api-access-5jxb7\") pod \"heat-operator-controller-manager-77b6666d85-qhsnc\" (UID: \"454883b6-ef08-4828-acc8-237632cf4a35\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.269559 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.281164 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.284338 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.285943 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-dxk2p" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.289287 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.290152 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.291675 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8qppq" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.296281 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.308907 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7ktb\" (UniqueName: \"kubernetes.io/projected/2195e51b-747e-4e94-b616-8fe940ffe5ed-kube-api-access-h7ktb\") pod \"mariadb-operator-controller-manager-658d4cdd5-zsvnq\" (UID: \"2195e51b-747e-4e94-b616-8fe940ffe5ed\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.308945 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2464n\" (UniqueName: \"kubernetes.io/projected/fe9827df-43c1-4491-923e-0c745e025aec-kube-api-access-2464n\") pod \"neutron-operator-controller-manager-776c5696bf-h9gxw\" (UID: \"fe9827df-43c1-4491-923e-0c745e025aec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.308971 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g52z9\" (UniqueName: \"kubernetes.io/projected/50bd86b6-6404-44de-822c-b75e6692a36b-kube-api-access-g52z9\") pod \"manila-operator-controller-manager-68f45f9d9f-sggnn\" (UID: \"50bd86b6-6404-44de-822c-b75e6692a36b\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.308998 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6kmhj\" (UniqueName: \"kubernetes.io/projected/9c8724e6-72a2-441e-bb6a-330ee1ccfe6f-kube-api-access-6kmhj\") pod \"keystone-operator-controller-manager-684f77d66d-bgssp\" (UID: \"9c8724e6-72a2-441e-bb6a-330ee1ccfe6f\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.309041 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gqvw\" (UniqueName: \"kubernetes.io/projected/4532e8a5-d461-4d46-99b9-6da31edb678b-kube-api-access-2gqvw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-f9gh7\" (UID: \"4532e8a5-d461-4d46-99b9-6da31edb678b\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.309069 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2bc\" (UniqueName: \"kubernetes.io/projected/cbf420e9-2c19-4582-b36b-0d4651f6d067-kube-api-access-2j2bc\") pod \"nova-operator-controller-manager-569cc54c5-7p294\" (UID: \"cbf420e9-2c19-4582-b36b-0d4651f6d067\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.322054 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.332089 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmhj\" (UniqueName: \"kubernetes.io/projected/9c8724e6-72a2-441e-bb6a-330ee1ccfe6f-kube-api-access-6kmhj\") pod \"keystone-operator-controller-manager-684f77d66d-bgssp\" (UID: \"9c8724e6-72a2-441e-bb6a-330ee1ccfe6f\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.340350 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.346077 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g52z9\" (UniqueName: \"kubernetes.io/projected/50bd86b6-6404-44de-822c-b75e6692a36b-kube-api-access-g52z9\") pod \"manila-operator-controller-manager-68f45f9d9f-sggnn\" (UID: \"50bd86b6-6404-44de-822c-b75e6692a36b\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.346588 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j2bc\" (UniqueName: \"kubernetes.io/projected/cbf420e9-2c19-4582-b36b-0d4651f6d067-kube-api-access-2j2bc\") pod \"nova-operator-controller-manager-569cc54c5-7p294\" (UID: \"cbf420e9-2c19-4582-b36b-0d4651f6d067\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.350644 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.352783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h7ktb\" (UniqueName: \"kubernetes.io/projected/2195e51b-747e-4e94-b616-8fe940ffe5ed-kube-api-access-h7ktb\") pod \"mariadb-operator-controller-manager-658d4cdd5-zsvnq\" (UID: \"2195e51b-747e-4e94-b616-8fe940ffe5ed\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.356701 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2464n\" (UniqueName: \"kubernetes.io/projected/fe9827df-43c1-4491-923e-0c745e025aec-kube-api-access-2464n\") pod \"neutron-operator-controller-manager-776c5696bf-h9gxw\" (UID: \"fe9827df-43c1-4491-923e-0c745e025aec\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.364183 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.375493 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-fh776"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.376350 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.378245 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9sr7k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.394653 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.402229 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-fh776"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.410196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gqvw\" (UniqueName: \"kubernetes.io/projected/4532e8a5-d461-4d46-99b9-6da31edb678b-kube-api-access-2gqvw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-f9gh7\" (UID: \"4532e8a5-d461-4d46-99b9-6da31edb678b\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.410272 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr6bl\" (UniqueName: \"kubernetes.io/projected/87c7aae2-dd1d-40f2-a54c-9239fc1998de-kube-api-access-wr6bl\") pod \"placement-operator-controller-manager-574d45c66c-t2cd8\" (UID: \"87c7aae2-dd1d-40f2-a54c-9239fc1998de\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.410331 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g68jt\" (UniqueName: \"kubernetes.io/projected/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-kube-api-access-g68jt\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.410364 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") 
pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.410392 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxdbb\" (UniqueName: \"kubernetes.io/projected/e07ee970-a07b-4a6b-b5b5-3387bd4b6da2-kube-api-access-kxdbb\") pod \"ovn-operator-controller-manager-bbc5b68f9-hthfl\" (UID: \"e07ee970-a07b-4a6b-b5b5-3387bd4b6da2\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.411000 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.427490 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.430323 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.431113 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.435141 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rmx8h" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.441184 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gqvw\" (UniqueName: \"kubernetes.io/projected/4532e8a5-d461-4d46-99b9-6da31edb678b-kube-api-access-2gqvw\") pod \"octavia-operator-controller-manager-5f4f55cb5c-f9gh7\" (UID: \"4532e8a5-d461-4d46-99b9-6da31edb678b\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.463949 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.500049 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.504279 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.512060 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-k9xwl" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.515092 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n82kt\" (UniqueName: \"kubernetes.io/projected/98c64520-2dc6-4f39-a102-58e0205f7d46-kube-api-access-n82kt\") pod \"swift-operator-controller-manager-677c674df7-fh776\" (UID: \"98c64520-2dc6-4f39-a102-58e0205f7d46\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.515148 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr6bl\" (UniqueName: \"kubernetes.io/projected/87c7aae2-dd1d-40f2-a54c-9239fc1998de-kube-api-access-wr6bl\") pod \"placement-operator-controller-manager-574d45c66c-t2cd8\" (UID: \"87c7aae2-dd1d-40f2-a54c-9239fc1998de\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.515202 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g68jt\" (UniqueName: \"kubernetes.io/projected/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-kube-api-access-g68jt\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.515236 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") pod 
\"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.515259 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdbb\" (UniqueName: \"kubernetes.io/projected/e07ee970-a07b-4a6b-b5b5-3387bd4b6da2-kube-api-access-kxdbb\") pod \"ovn-operator-controller-manager-bbc5b68f9-hthfl\" (UID: \"e07ee970-a07b-4a6b-b5b5-3387bd4b6da2\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.516082 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.516129 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert podName:6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715 nodeName:}" failed. No retries permitted until 2026-03-12 15:04:48.016114753 +0000 UTC m=+1040.301340021 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" (UID: "6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.528578 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.537630 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.538490 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.540452 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8gp7h" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.542203 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.544039 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g68jt\" (UniqueName: \"kubernetes.io/projected/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-kube-api-access-g68jt\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.544320 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.546714 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdbb\" (UniqueName: \"kubernetes.io/projected/e07ee970-a07b-4a6b-b5b5-3387bd4b6da2-kube-api-access-kxdbb\") pod \"ovn-operator-controller-manager-bbc5b68f9-hthfl\" (UID: \"e07ee970-a07b-4a6b-b5b5-3387bd4b6da2\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.556123 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.556907 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.560101 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-z5jz6" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.560395 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.560557 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.561131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr6bl\" (UniqueName: \"kubernetes.io/projected/87c7aae2-dd1d-40f2-a54c-9239fc1998de-kube-api-access-wr6bl\") pod \"placement-operator-controller-manager-574d45c66c-t2cd8\" (UID: \"87c7aae2-dd1d-40f2-a54c-9239fc1998de\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" Mar 12 15:04:47 crc 
kubenswrapper[4869]: I0312 15:04:47.570214 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.592058 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.600867 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.616125 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n82kt\" (UniqueName: \"kubernetes.io/projected/98c64520-2dc6-4f39-a102-58e0205f7d46-kube-api-access-n82kt\") pod \"swift-operator-controller-manager-677c674df7-fh776\" (UID: \"98c64520-2dc6-4f39-a102-58e0205f7d46\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.616202 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qwcw\" (UniqueName: \"kubernetes.io/projected/da890c60-e9bb-49a9-97cc-67696823d7d8-kube-api-access-2qwcw\") pod \"test-operator-controller-manager-5c5cb9c4d7-96d7k\" (UID: \"da890c60-e9bb-49a9-97cc-67696823d7d8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.616271 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857dn\" (UniqueName: \"kubernetes.io/projected/3760f848-91f6-4fdf-9bd4-d6ffbac2de6d-kube-api-access-857dn\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mhw7h\" (UID: \"3760f848-91f6-4fdf-9bd4-d6ffbac2de6d\") " 
pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.658159 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.658184 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n82kt\" (UniqueName: \"kubernetes.io/projected/98c64520-2dc6-4f39-a102-58e0205f7d46-kube-api-access-n82kt\") pod \"swift-operator-controller-manager-677c674df7-fh776\" (UID: \"98c64520-2dc6-4f39-a102-58e0205f7d46\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.658988 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.667489 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.668702 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-xzs7b" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.673991 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.681182 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.696405 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.718895 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.718971 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qwcw\" (UniqueName: \"kubernetes.io/projected/da890c60-e9bb-49a9-97cc-67696823d7d8-kube-api-access-2qwcw\") pod \"test-operator-controller-manager-5c5cb9c4d7-96d7k\" (UID: \"da890c60-e9bb-49a9-97cc-67696823d7d8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.719034 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.719060 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swh28\" (UniqueName: \"kubernetes.io/projected/eff4102a-7465-4994-b6d5-1982a6ec713b-kube-api-access-swh28\") pod \"watcher-operator-controller-manager-6dd88c6f67-s5ptd\" (UID: \"eff4102a-7465-4994-b6d5-1982a6ec713b\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.719106 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-857dn\" (UniqueName: \"kubernetes.io/projected/3760f848-91f6-4fdf-9bd4-d6ffbac2de6d-kube-api-access-857dn\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mhw7h\" (UID: \"3760f848-91f6-4fdf-9bd4-d6ffbac2de6d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.719127 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbsgg\" (UniqueName: \"kubernetes.io/projected/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-kube-api-access-gbsgg\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.719134 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.719182 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert podName:c9089d1a-4d28-4973-a826-c7fa8b99acab nodeName:}" failed. No retries permitted until 2026-03-12 15:04:48.719166396 +0000 UTC m=+1041.004391674 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert") pod "infra-operator-controller-manager-5995f4446f-vshpv" (UID: "c9089d1a-4d28-4973-a826-c7fa8b99acab") : secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.719198 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.773361 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx"] Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.777462 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-857dn\" (UniqueName: \"kubernetes.io/projected/3760f848-91f6-4fdf-9bd4-d6ffbac2de6d-kube-api-access-857dn\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-mhw7h\" (UID: \"3760f848-91f6-4fdf-9bd4-d6ffbac2de6d\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.779293 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qwcw\" (UniqueName: \"kubernetes.io/projected/da890c60-e9bb-49a9-97cc-67696823d7d8-kube-api-access-2qwcw\") pod \"test-operator-controller-manager-5c5cb9c4d7-96d7k\" (UID: \"da890c60-e9bb-49a9-97cc-67696823d7d8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.801633 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.823496 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swh28\" (UniqueName: \"kubernetes.io/projected/eff4102a-7465-4994-b6d5-1982a6ec713b-kube-api-access-swh28\") pod \"watcher-operator-controller-manager-6dd88c6f67-s5ptd\" (UID: \"eff4102a-7465-4994-b6d5-1982a6ec713b\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.823590 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbsgg\" (UniqueName: \"kubernetes.io/projected/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-kube-api-access-gbsgg\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.823656 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.823692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.823790 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qclx5\" (UniqueName: \"kubernetes.io/projected/93492e48-5fe6-4e20-8597-738a93b6412c-kube-api-access-qclx5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zlbth\" (UID: \"93492e48-5fe6-4e20-8597-738a93b6412c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.824039 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.824103 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:48.324084304 +0000 UTC m=+1040.609309582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "webhook-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.824317 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: E0312 15:04:47.824341 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:48.324334361 +0000 UTC m=+1040.609559639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "metrics-server-cert" not found Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.845444 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbsgg\" (UniqueName: \"kubernetes.io/projected/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-kube-api-access-gbsgg\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.848004 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swh28\" (UniqueName: \"kubernetes.io/projected/eff4102a-7465-4994-b6d5-1982a6ec713b-kube-api-access-swh28\") pod \"watcher-operator-controller-manager-6dd88c6f67-s5ptd\" (UID: \"eff4102a-7465-4994-b6d5-1982a6ec713b\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.906378 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.916220 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.928337 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qclx5\" (UniqueName: \"kubernetes.io/projected/93492e48-5fe6-4e20-8597-738a93b6412c-kube-api-access-qclx5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zlbth\" (UID: \"93492e48-5fe6-4e20-8597-738a93b6412c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.947387 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qclx5\" (UniqueName: \"kubernetes.io/projected/93492e48-5fe6-4e20-8597-738a93b6412c-kube-api-access-qclx5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zlbth\" (UID: \"93492e48-5fe6-4e20-8597-738a93b6412c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" Mar 12 15:04:47 crc kubenswrapper[4869]: I0312 15:04:47.987748 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.033159 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.033377 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.033443 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert podName:6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715 nodeName:}" failed. No retries permitted until 2026-03-12 15:04:49.033428996 +0000 UTC m=+1041.318654274 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" (UID: "6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.036984 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.350523 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.360640 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.352685 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.361598 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:49.361517078 +0000 UTC m=+1041.646742416 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "webhook-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.360906 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.361983 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:49.361974331 +0000 UTC m=+1041.647199609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "metrics-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.477784 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.480707 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.484752 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.658923 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" 
event={"ID":"8d41de12-9bea-4bbc-a276-296376e563a8","Type":"ContainerStarted","Data":"5b470a3811e55a8df95576d9bc01ef0aecfbcaaf535a925cd20cfeeee9a73308"} Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.674161 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" event={"ID":"d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26","Type":"ContainerStarted","Data":"9e032b8bfe40d5b67d711b4bac8cc8bda418e51a5f116a364054d2d4233ebca9"} Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.675598 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.680871 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" event={"ID":"9c4c52f4-d899-4059-8d91-29e4dd1971fd","Type":"ContainerStarted","Data":"549df4cbf2e4305169959273ce435b7dfeb804418aaad35b1c0365fdf77f9203"} Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.683682 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" event={"ID":"acaa3149-c349-4d8e-95ba-56d1714eb3b6","Type":"ContainerStarted","Data":"b114216f5f6711a80ab0cf2409ee470f84f5d29cca3df3ef842eae5177879109"} Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.688089 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" event={"ID":"62ce9e5f-8d12-425b-b966-aca955bd96d9","Type":"ContainerStarted","Data":"593e227af528258bbaf9510d0eaa2de57bdbd3396c9f8f3570d9a25d759c7ba2"} Mar 12 15:04:48 crc kubenswrapper[4869]: W0312 15:04:48.690115 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c8724e6_72a2_441e_bb6a_330ee1ccfe6f.slice/crio-71ee46c0c36680e9acf589af8cf68db20f21f44638e5afff692bfaf10f069794 WatchSource:0}: Error finding container 71ee46c0c36680e9acf589af8cf68db20f21f44638e5afff692bfaf10f069794: Status 404 returned error can't find the container with id 71ee46c0c36680e9acf589af8cf68db20f21f44638e5afff692bfaf10f069794 Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.707256 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq"] Mar 12 15:04:48 crc kubenswrapper[4869]: W0312 15:04:48.726705 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2195e51b_747e_4e94_b616_8fe940ffe5ed.slice/crio-5e463c3c418da7affdda42fed8e3cafae0f34a74de09313e39fbe5bc79edea62 WatchSource:0}: Error finding container 5e463c3c418da7affdda42fed8e3cafae0f34a74de09313e39fbe5bc79edea62: Status 404 returned error can't find the container with id 5e463c3c418da7affdda42fed8e3cafae0f34a74de09313e39fbe5bc79edea62 Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.726766 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.759865 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.769001 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.772691 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod 
\"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.773608 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.773674 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert podName:c9089d1a-4d28-4973-a826-c7fa8b99acab nodeName:}" failed. No retries permitted until 2026-03-12 15:04:50.773653964 +0000 UTC m=+1043.058879242 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert") pod "infra-operator-controller-manager-5995f4446f-vshpv" (UID: "c9089d1a-4d28-4973-a826-c7fa8b99acab") : secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.858939 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-7p294"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.885876 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.893265 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.898099 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7"] Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.904631 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k"] Mar 12 15:04:48 crc kubenswrapper[4869]: W0312 15:04:48.915017 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3760f848_91f6_4fdf_9bd4_d6ffbac2de6d.slice/crio-49c798ca3a2da9911d01ddbbb1b28419f05dbf52c404633c5072b9ba5010efea WatchSource:0}: Error finding container 49c798ca3a2da9911d01ddbbb1b28419f05dbf52c404633c5072b9ba5010efea: Status 404 returned error can't find the container with id 49c798ca3a2da9911d01ddbbb1b28419f05dbf52c404633c5072b9ba5010efea Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.924996 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-857dn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-mhw7h_openstack-operators(3760f848-91f6-4fdf-9bd4-d6ffbac2de6d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.926840 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" podUID="3760f848-91f6-4fdf-9bd4-d6ffbac2de6d" Mar 12 15:04:48 crc kubenswrapper[4869]: W0312 15:04:48.929987 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode07ee970_a07b_4a6b_b5b5_3387bd4b6da2.slice/crio-ddf2f4771187d553bcb210d8fd969ed7551fec3a6b7b7f716fa2aaec42da135b WatchSource:0}: Error finding container 
ddf2f4771187d553bcb210d8fd969ed7551fec3a6b7b7f716fa2aaec42da135b: Status 404 returned error can't find the container with id ddf2f4771187d553bcb210d8fd969ed7551fec3a6b7b7f716fa2aaec42da135b Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.934779 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kxdbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-hthfl_openstack-operators(e07ee970-a07b-4a6b-b5b5-3387bd4b6da2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 15:04:48 crc kubenswrapper[4869]: E0312 15:04:48.936049 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" podUID="e07ee970-a07b-4a6b-b5b5-3387bd4b6da2" Mar 12 15:04:48 crc kubenswrapper[4869]: I0312 15:04:48.947192 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl"] Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.072367 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-fh776"] Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.077450 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" 
(UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.077645 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.077703 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert podName:6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715 nodeName:}" failed. No retries permitted until 2026-03-12 15:04:51.077686144 +0000 UTC m=+1043.362911422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" (UID: "6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.080569 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth"] Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.086123 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8"] Mar 12 15:04:49 crc kubenswrapper[4869]: W0312 15:04:49.087856 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93492e48_5fe6_4e20_8597_738a93b6412c.slice/crio-87e93c055b2886ae3ad0f04464894e8d799435a41260639d95c4a50d22ef9a7e WatchSource:0}: Error finding container 87e93c055b2886ae3ad0f04464894e8d799435a41260639d95c4a50d22ef9a7e: Status 404 returned error can't find the container with id 87e93c055b2886ae3ad0f04464894e8d799435a41260639d95c4a50d22ef9a7e 
Mar 12 15:04:49 crc kubenswrapper[4869]: W0312 15:04:49.088463 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c64520_2dc6_4f39_a102_58e0205f7d46.slice/crio-0b8e4e802411d8bde331dffec06d4efcb370abd499210e5c1afa803f3c593958 WatchSource:0}: Error finding container 0b8e4e802411d8bde331dffec06d4efcb370abd499210e5c1afa803f3c593958: Status 404 returned error can't find the container with id 0b8e4e802411d8bde331dffec06d4efcb370abd499210e5c1afa803f3c593958 Mar 12 15:04:49 crc kubenswrapper[4869]: W0312 15:04:49.090175 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c7aae2_dd1d_40f2_a54c_9239fc1998de.slice/crio-e91c31a14cd97855cffcdf87d994eadd014e0cd5528c4da3d62095253cecf842 WatchSource:0}: Error finding container e91c31a14cd97855cffcdf87d994eadd014e0cd5528c4da3d62095253cecf842: Status 404 returned error can't find the container with id e91c31a14cd97855cffcdf87d994eadd014e0cd5528c4da3d62095253cecf842 Mar 12 15:04:49 crc kubenswrapper[4869]: W0312 15:04:49.090857 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff4102a_7465_4994_b6d5_1982a6ec713b.slice/crio-fc40cf05849b77801146ab67b47ba0938f711c91b31abeaf7153904566af65d3 WatchSource:0}: Error finding container fc40cf05849b77801146ab67b47ba0938f711c91b31abeaf7153904566af65d3: Status 404 returned error can't find the container with id fc40cf05849b77801146ab67b47ba0938f711c91b31abeaf7153904566af65d3 Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.092338 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wr6bl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-t2cd8_openstack-operators(87c7aae2-dd1d-40f2-a54c-9239fc1998de): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.092479 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n82kt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-fh776_openstack-operators(98c64520-2dc6-4f39-a102-58e0205f7d46): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.093406 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" podUID="87c7aae2-dd1d-40f2-a54c-9239fc1998de" Mar 12 15:04:49 crc 
kubenswrapper[4869]: I0312 15:04:49.094407 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd"] Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.094483 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" podUID="98c64520-2dc6-4f39-a102-58e0205f7d46" Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.096364 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qclx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-zlbth_openstack-operators(93492e48-5fe6-4e20-8597-738a93b6412c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.097736 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" podUID="93492e48-5fe6-4e20-8597-738a93b6412c" Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.383321 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.383400 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.383601 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.383655 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:51.383638937 +0000 UTC m=+1043.668864215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "metrics-server-cert" not found Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.384095 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.384129 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:51.384120171 +0000 UTC m=+1043.669345449 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "webhook-server-cert" not found Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.693646 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" event={"ID":"e07ee970-a07b-4a6b-b5b5-3387bd4b6da2","Type":"ContainerStarted","Data":"ddf2f4771187d553bcb210d8fd969ed7551fec3a6b7b7f716fa2aaec42da135b"} Mar 12 15:04:49 crc kubenswrapper[4869]: E0312 15:04:49.695164 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" podUID="e07ee970-a07b-4a6b-b5b5-3387bd4b6da2" Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.968557 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:04:49 crc kubenswrapper[4869]: I0312 15:04:49.968643 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.016613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" event={"ID":"2195e51b-747e-4e94-b616-8fe940ffe5ed","Type":"ContainerStarted","Data":"5e463c3c418da7affdda42fed8e3cafae0f34a74de09313e39fbe5bc79edea62"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.018400 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" event={"ID":"87c7aae2-dd1d-40f2-a54c-9239fc1998de","Type":"ContainerStarted","Data":"e91c31a14cd97855cffcdf87d994eadd014e0cd5528c4da3d62095253cecf842"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.050875 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" event={"ID":"fe9827df-43c1-4491-923e-0c745e025aec","Type":"ContainerStarted","Data":"30690f713b11f806b6e502e14418e9d007569ce289a68fe202d003494abffd13"} Mar 12 15:04:50 crc kubenswrapper[4869]: E0312 15:04:50.076788 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" podUID="87c7aae2-dd1d-40f2-a54c-9239fc1998de" Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.161850 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" event={"ID":"93492e48-5fe6-4e20-8597-738a93b6412c","Type":"ContainerStarted","Data":"87e93c055b2886ae3ad0f04464894e8d799435a41260639d95c4a50d22ef9a7e"} Mar 12 15:04:50 crc kubenswrapper[4869]: E0312 15:04:50.165409 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" podUID="93492e48-5fe6-4e20-8597-738a93b6412c" Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.169131 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" event={"ID":"50bd86b6-6404-44de-822c-b75e6692a36b","Type":"ContainerStarted","Data":"609d7c2855f39eff5f7b711ceb395801b137d67b16ce0431896101d1acc0edc7"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.181862 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" event={"ID":"bdcee336-08a4-4504-8eb3-b09b4899d2ed","Type":"ContainerStarted","Data":"76f470810f49f241573699292cd2928668bcbd15cb6d214c71fb0b766f80b5ad"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.197678 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" event={"ID":"4532e8a5-d461-4d46-99b9-6da31edb678b","Type":"ContainerStarted","Data":"dc64b31c896d01a3c830812513354872e758c0052b305201e9023009f8906f11"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.215223 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" event={"ID":"9c8724e6-72a2-441e-bb6a-330ee1ccfe6f","Type":"ContainerStarted","Data":"71ee46c0c36680e9acf589af8cf68db20f21f44638e5afff692bfaf10f069794"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.229638 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" 
event={"ID":"3760f848-91f6-4fdf-9bd4-d6ffbac2de6d","Type":"ContainerStarted","Data":"49c798ca3a2da9911d01ddbbb1b28419f05dbf52c404633c5072b9ba5010efea"} Mar 12 15:04:50 crc kubenswrapper[4869]: E0312 15:04:50.238588 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" podUID="3760f848-91f6-4fdf-9bd4-d6ffbac2de6d" Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.240961 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" event={"ID":"da890c60-e9bb-49a9-97cc-67696823d7d8","Type":"ContainerStarted","Data":"e49e7b415209e89fe6703f0ed3bb0452ad77d934b1af577e9826a357dac5174d"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.243592 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" event={"ID":"cbf420e9-2c19-4582-b36b-0d4651f6d067","Type":"ContainerStarted","Data":"c22d69aa98135ea5d959b619c751aea21652ec158c643480ca2d9bfd6bf5be5b"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.248199 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" event={"ID":"98c64520-2dc6-4f39-a102-58e0205f7d46","Type":"ContainerStarted","Data":"0b8e4e802411d8bde331dffec06d4efcb370abd499210e5c1afa803f3c593958"} Mar 12 15:04:50 crc kubenswrapper[4869]: E0312 15:04:50.252109 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" podUID="98c64520-2dc6-4f39-a102-58e0205f7d46" Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.257923 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" event={"ID":"eff4102a-7465-4994-b6d5-1982a6ec713b","Type":"ContainerStarted","Data":"fc40cf05849b77801146ab67b47ba0938f711c91b31abeaf7153904566af65d3"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.261703 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" event={"ID":"454883b6-ef08-4828-acc8-237632cf4a35","Type":"ContainerStarted","Data":"2c78be0e2804544508076696d1d148bd3daebf356fd841cb326a77188bfa3275"} Mar 12 15:04:50 crc kubenswrapper[4869]: I0312 15:04:50.792444 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:50 crc kubenswrapper[4869]: E0312 15:04:50.792612 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:50 crc kubenswrapper[4869]: E0312 15:04:50.792679 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert podName:c9089d1a-4d28-4973-a826-c7fa8b99acab nodeName:}" failed. No retries permitted until 2026-03-12 15:04:54.792661139 +0000 UTC m=+1047.077886417 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert") pod "infra-operator-controller-manager-5995f4446f-vshpv" (UID: "c9089d1a-4d28-4973-a826-c7fa8b99acab") : secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:51 crc kubenswrapper[4869]: I0312 15:04:51.097942 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.098108 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.098193 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert podName:6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715 nodeName:}" failed. No retries permitted until 2026-03-12 15:04:55.09815168 +0000 UTC m=+1047.383376948 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" (UID: "6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.271351 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" podUID="3760f848-91f6-4fdf-9bd4-d6ffbac2de6d" Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.271453 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" podUID="98c64520-2dc6-4f39-a102-58e0205f7d46" Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.271472 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" podUID="e07ee970-a07b-4a6b-b5b5-3387bd4b6da2" Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.272326 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" podUID="87c7aae2-dd1d-40f2-a54c-9239fc1998de" Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.274471 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" podUID="93492e48-5fe6-4e20-8597-738a93b6412c" Mar 12 15:04:51 crc kubenswrapper[4869]: I0312 15:04:51.437168 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:51 crc kubenswrapper[4869]: I0312 15:04:51.437233 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.437365 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.437433 4869 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:55.437417489 +0000 UTC m=+1047.722642767 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "metrics-server-cert" not found Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.437368 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 15:04:51 crc kubenswrapper[4869]: E0312 15:04:51.437482 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:04:55.437476501 +0000 UTC m=+1047.722701779 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "webhook-server-cert" not found Mar 12 15:04:54 crc kubenswrapper[4869]: I0312 15:04:54.888469 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:04:54 crc kubenswrapper[4869]: E0312 15:04:54.888713 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:54 crc kubenswrapper[4869]: E0312 15:04:54.889057 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert podName:c9089d1a-4d28-4973-a826-c7fa8b99acab nodeName:}" failed. No retries permitted until 2026-03-12 15:05:02.889033984 +0000 UTC m=+1055.174259332 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert") pod "infra-operator-controller-manager-5995f4446f-vshpv" (UID: "c9089d1a-4d28-4973-a826-c7fa8b99acab") : secret "infra-operator-webhook-server-cert" not found Mar 12 15:04:55 crc kubenswrapper[4869]: I0312 15:04:55.193705 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:04:55 crc kubenswrapper[4869]: E0312 15:04:55.193860 4869 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:55 crc kubenswrapper[4869]: E0312 15:04:55.193996 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert podName:6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715 nodeName:}" failed. No retries permitted until 2026-03-12 15:05:03.193950958 +0000 UTC m=+1055.479176316 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" (UID: "6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 15:04:55 crc kubenswrapper[4869]: I0312 15:04:55.498231 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:55 crc kubenswrapper[4869]: I0312 15:04:55.498311 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:04:55 crc kubenswrapper[4869]: E0312 15:04:55.498556 4869 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 15:04:55 crc kubenswrapper[4869]: E0312 15:04:55.498641 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:05:03.498618715 +0000 UTC m=+1055.783843983 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "metrics-server-cert" not found Mar 12 15:04:55 crc kubenswrapper[4869]: E0312 15:04:55.498697 4869 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 15:04:55 crc kubenswrapper[4869]: E0312 15:04:55.498726 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs podName:2fd71c8f-f3bf-416b-9a7b-fd108b10853d nodeName:}" failed. No retries permitted until 2026-03-12 15:05:03.498719348 +0000 UTC m=+1055.783944626 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs") pod "openstack-operator-controller-manager-7d46bf84bd-r6xt7" (UID: "2fd71c8f-f3bf-416b-9a7b-fd108b10853d") : secret "webhook-server-cert" not found Mar 12 15:05:02 crc kubenswrapper[4869]: I0312 15:05:02.921016 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:05:02 crc kubenswrapper[4869]: E0312 15:05:02.921282 4869 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 15:05:02 crc kubenswrapper[4869]: E0312 15:05:02.922127 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert 
podName:c9089d1a-4d28-4973-a826-c7fa8b99acab nodeName:}" failed. No retries permitted until 2026-03-12 15:05:18.922100051 +0000 UTC m=+1071.207325329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert") pod "infra-operator-controller-manager-5995f4446f-vshpv" (UID: "c9089d1a-4d28-4973-a826-c7fa8b99acab") : secret "infra-operator-webhook-server-cert" not found Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.227372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.239485 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz\" (UID: \"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.528109 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.532358 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.532413 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.535265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-metrics-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.537261 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2fd71c8f-f3bf-416b-9a7b-fd108b10853d-webhook-certs\") pod \"openstack-operator-controller-manager-7d46bf84bd-r6xt7\" (UID: \"2fd71c8f-f3bf-416b-9a7b-fd108b10853d\") " pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:05:03 crc kubenswrapper[4869]: I0312 15:05:03.576616 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.235137 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz"] Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.379807 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7"] Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.410059 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" event={"ID":"50bd86b6-6404-44de-822c-b75e6692a36b","Type":"ContainerStarted","Data":"4b56f978b3979877e75ad20475afe529947b8900260565628f0d7480d1e0aaf8"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.410366 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.437588 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" event={"ID":"d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26","Type":"ContainerStarted","Data":"5a37911fa06a1f13ebec6ed817a2f650448379cbe49a82719261f128a616ea61"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.439005 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.442703 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" podStartSLOduration=9.762321259 podStartE2EDuration="22.442682657s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" 
firstStartedPulling="2026-03-12 15:04:48.752917806 +0000 UTC m=+1041.038143084" lastFinishedPulling="2026-03-12 15:05:01.433279204 +0000 UTC m=+1053.718504482" observedRunningTime="2026-03-12 15:05:08.433852167 +0000 UTC m=+1060.719077445" watchObservedRunningTime="2026-03-12 15:05:08.442682657 +0000 UTC m=+1060.727907935" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.451656 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" event={"ID":"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715","Type":"ContainerStarted","Data":"5d60197b45256e071dfb66a61fdb8a05a401a2921ac6c3b3620cfdf28727550a"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.490491 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" podStartSLOduration=9.566602785 podStartE2EDuration="22.490462484s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.509342343 +0000 UTC m=+1040.794567621" lastFinishedPulling="2026-03-12 15:05:01.433202042 +0000 UTC m=+1053.718427320" observedRunningTime="2026-03-12 15:05:08.489509497 +0000 UTC m=+1060.774734805" watchObservedRunningTime="2026-03-12 15:05:08.490462484 +0000 UTC m=+1060.775687762" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.502844 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" event={"ID":"bdcee336-08a4-4504-8eb3-b09b4899d2ed","Type":"ContainerStarted","Data":"01e07ea57f282282dcf5df8d046c9ab7b71767107ad1a05c67be51cdda1be556"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.504177 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.528924 4869 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" event={"ID":"8d41de12-9bea-4bbc-a276-296376e563a8","Type":"ContainerStarted","Data":"fd2d47a4f8407501b0f2f6f390b1fd48a2d7ba1f351076401804b41f5b046d15"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.529798 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.545043 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" podStartSLOduration=9.87017258 podStartE2EDuration="22.545028152s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.758772072 +0000 UTC m=+1041.043997350" lastFinishedPulling="2026-03-12 15:05:01.433627644 +0000 UTC m=+1053.718852922" observedRunningTime="2026-03-12 15:05:08.54388161 +0000 UTC m=+1060.829106888" watchObservedRunningTime="2026-03-12 15:05:08.545028152 +0000 UTC m=+1060.830253430" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.563917 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" event={"ID":"fe9827df-43c1-4491-923e-0c745e025aec","Type":"ContainerStarted","Data":"8b94b449fd95e7911766471205df06a1216009c31ac371f95642af571751a615"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.564661 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.566569 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" podStartSLOduration=3.285847142 podStartE2EDuration="22.566552653s" 
podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.497346813 +0000 UTC m=+1040.782572091" lastFinishedPulling="2026-03-12 15:05:07.778052324 +0000 UTC m=+1060.063277602" observedRunningTime="2026-03-12 15:05:08.563704082 +0000 UTC m=+1060.848929380" watchObservedRunningTime="2026-03-12 15:05:08.566552653 +0000 UTC m=+1060.851777931" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.587975 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" event={"ID":"acaa3149-c349-4d8e-95ba-56d1714eb3b6","Type":"ContainerStarted","Data":"5cf1f66d9970ce8f5e7933c40edee3804dc084428d7534e422bfd351d5092fb2"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.588670 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.595193 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" event={"ID":"da890c60-e9bb-49a9-97cc-67696823d7d8","Type":"ContainerStarted","Data":"4cba99e458288ad94b52ce718da736cbe47383f831bdc2c184afdbac34c2171b"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.595860 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.599180 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" podStartSLOduration=2.796885527 podStartE2EDuration="21.599164929s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.904829598 +0000 UTC m=+1041.190054876" lastFinishedPulling="2026-03-12 15:05:07.70710898 +0000 UTC m=+1059.992334278" 
observedRunningTime="2026-03-12 15:05:08.591939634 +0000 UTC m=+1060.877164912" watchObservedRunningTime="2026-03-12 15:05:08.599164929 +0000 UTC m=+1060.884390207" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.607633 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" event={"ID":"62ce9e5f-8d12-425b-b966-aca955bd96d9","Type":"ContainerStarted","Data":"c6796e67f92fda109fcf66d7070f93b80c0797b4ea4c2f28a7a8780d1711b7cf"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.608294 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.612100 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" event={"ID":"4532e8a5-d461-4d46-99b9-6da31edb678b","Type":"ContainerStarted","Data":"6822d1e25319a64d0cbd87d3fc9d1f8f60f32f2549c25c0e807e0cfe7c6e2783"} Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.612879 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.623357 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" podStartSLOduration=9.244876993 podStartE2EDuration="22.623341295s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.054896405 +0000 UTC m=+1040.340121673" lastFinishedPulling="2026-03-12 15:05:01.433360697 +0000 UTC m=+1053.718585975" observedRunningTime="2026-03-12 15:05:08.622221693 +0000 UTC m=+1060.907446971" watchObservedRunningTime="2026-03-12 15:05:08.623341295 +0000 UTC m=+1060.908566573" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 
15:05:08.649566 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" podStartSLOduration=2.788107545 podStartE2EDuration="22.649534418s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:47.839718808 +0000 UTC m=+1040.124944086" lastFinishedPulling="2026-03-12 15:05:07.701145671 +0000 UTC m=+1059.986370959" observedRunningTime="2026-03-12 15:05:08.648674004 +0000 UTC m=+1060.933899282" watchObservedRunningTime="2026-03-12 15:05:08.649534418 +0000 UTC m=+1060.934759696" Mar 12 15:05:08 crc kubenswrapper[4869]: I0312 15:05:08.675741 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" podStartSLOduration=2.810935385 podStartE2EDuration="21.675725282s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.914618335 +0000 UTC m=+1041.199843613" lastFinishedPulling="2026-03-12 15:05:07.779408222 +0000 UTC m=+1060.064633510" observedRunningTime="2026-03-12 15:05:08.673831548 +0000 UTC m=+1060.959056826" watchObservedRunningTime="2026-03-12 15:05:08.675725282 +0000 UTC m=+1060.960950560" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.627222 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" event={"ID":"2fd71c8f-f3bf-416b-9a7b-fd108b10853d","Type":"ContainerStarted","Data":"40582438b3c981e3ca335b845b6310d855176623d44c157ea588be0023a30908"} Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.627313 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" event={"ID":"2fd71c8f-f3bf-416b-9a7b-fd108b10853d","Type":"ContainerStarted","Data":"bfafcb056b06bbe5e3b26783475d5b68f00f6df158606b43f50607cb8faab143"} Mar 12 15:05:09 crc 
kubenswrapper[4869]: I0312 15:05:09.628002 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.636355 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" event={"ID":"9c4c52f4-d899-4059-8d91-29e4dd1971fd","Type":"ContainerStarted","Data":"9c52ef28a25fbba12086e20927bb3ade91b2c92d20266ecf12c68b5561887d0f"} Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.636825 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.643875 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" event={"ID":"9c8724e6-72a2-441e-bb6a-330ee1ccfe6f","Type":"ContainerStarted","Data":"a8315c08f80ebfeb48a649297c67f78751c67741d5122bfb17d4b85851239d60"} Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.644351 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.650879 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" event={"ID":"eff4102a-7465-4994-b6d5-1982a6ec713b","Type":"ContainerStarted","Data":"60e6ff25030ce03f57fda14850408a8622e4e0d1e5d1380ff380018db76f3df8"} Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.651385 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.660727 4869 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" podStartSLOduration=22.660710498 podStartE2EDuration="22.660710498s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:05:09.659467333 +0000 UTC m=+1061.944692611" watchObservedRunningTime="2026-03-12 15:05:09.660710498 +0000 UTC m=+1061.945935776" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.661145 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" podStartSLOduration=10.142853943 podStartE2EDuration="22.661141611s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.914995936 +0000 UTC m=+1041.200221214" lastFinishedPulling="2026-03-12 15:05:01.433283604 +0000 UTC m=+1053.718508882" observedRunningTime="2026-03-12 15:05:08.698578831 +0000 UTC m=+1060.983804109" watchObservedRunningTime="2026-03-12 15:05:09.661141611 +0000 UTC m=+1061.946366889" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.664364 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" event={"ID":"454883b6-ef08-4828-acc8-237632cf4a35","Type":"ContainerStarted","Data":"6a27945a5014fe768cb569405aeb439f99f305fd107d2ad6294201e014d49760"} Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.664670 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.671083 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" 
event={"ID":"cbf420e9-2c19-4582-b36b-0d4651f6d067","Type":"ContainerStarted","Data":"7aafafb92df5c43e05ba31339257edb4112521d00437d179c608215d67592cd0"} Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.671516 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.674687 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" event={"ID":"2195e51b-747e-4e94-b616-8fe940ffe5ed","Type":"ContainerStarted","Data":"c246819a7aeab0e29de3f81adf6493edfcebb7d991aba5481e339a899e622aed"} Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.675365 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.688567 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" podStartSLOduration=4.567779146 podStartE2EDuration="23.688552159s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.702062982 +0000 UTC m=+1040.987288260" lastFinishedPulling="2026-03-12 15:05:07.822835995 +0000 UTC m=+1060.108061273" observedRunningTime="2026-03-12 15:05:09.687669854 +0000 UTC m=+1061.972895132" watchObservedRunningTime="2026-03-12 15:05:09.688552159 +0000 UTC m=+1061.973777437" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.707377 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" podStartSLOduration=4.428060222 podStartE2EDuration="23.707360783s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.496257512 +0000 UTC m=+1040.781482790" 
lastFinishedPulling="2026-03-12 15:05:07.775558073 +0000 UTC m=+1060.060783351" observedRunningTime="2026-03-12 15:05:09.704272135 +0000 UTC m=+1061.989497413" watchObservedRunningTime="2026-03-12 15:05:09.707360783 +0000 UTC m=+1061.992586061" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.736870 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" podStartSLOduration=4.1365535 podStartE2EDuration="22.73685356s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:49.097329961 +0000 UTC m=+1041.382555239" lastFinishedPulling="2026-03-12 15:05:07.697630011 +0000 UTC m=+1059.982855299" observedRunningTime="2026-03-12 15:05:09.72701752 +0000 UTC m=+1062.012242798" watchObservedRunningTime="2026-03-12 15:05:09.73685356 +0000 UTC m=+1062.022078838" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.755297 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" podStartSLOduration=3.863935583 podStartE2EDuration="22.755283683s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.886308112 +0000 UTC m=+1041.171533390" lastFinishedPulling="2026-03-12 15:05:07.777656212 +0000 UTC m=+1060.062881490" observedRunningTime="2026-03-12 15:05:09.751870986 +0000 UTC m=+1062.037096264" watchObservedRunningTime="2026-03-12 15:05:09.755283683 +0000 UTC m=+1062.040508961" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.789222 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" podStartSLOduration=11.08638601 podStartE2EDuration="23.789205646s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.730381026 +0000 UTC m=+1041.015606304" 
lastFinishedPulling="2026-03-12 15:05:01.433200672 +0000 UTC m=+1053.718425940" observedRunningTime="2026-03-12 15:05:09.771695689 +0000 UTC m=+1062.056920967" watchObservedRunningTime="2026-03-12 15:05:09.789205646 +0000 UTC m=+1062.074430924" Mar 12 15:05:09 crc kubenswrapper[4869]: I0312 15:05:09.789608 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" podStartSLOduration=4.771478778 podStartE2EDuration="23.789602767s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.75730921 +0000 UTC m=+1041.042534488" lastFinishedPulling="2026-03-12 15:05:07.775433199 +0000 UTC m=+1060.060658477" observedRunningTime="2026-03-12 15:05:09.789054551 +0000 UTC m=+1062.074279829" watchObservedRunningTime="2026-03-12 15:05:09.789602767 +0000 UTC m=+1062.074828045" Mar 12 15:05:13 crc kubenswrapper[4869]: I0312 15:05:13.582861 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7d46bf84bd-r6xt7" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.749787 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" event={"ID":"87c7aae2-dd1d-40f2-a54c-9239fc1998de","Type":"ContainerStarted","Data":"ca580ddfb7a4ff324f20d7c0bbf128ed9658c3726fe95dae83ff1af2fdba2767"} Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.750697 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.751037 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" 
event={"ID":"98c64520-2dc6-4f39-a102-58e0205f7d46","Type":"ContainerStarted","Data":"1014f51373705fed590721008df9dcd84096a4236df7b50c98be48ab0f12410e"} Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.751184 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.752393 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" event={"ID":"93492e48-5fe6-4e20-8597-738a93b6412c","Type":"ContainerStarted","Data":"84eb628a6f50f432ebf935163a1fe858ad37bc58992e1990d2f7c6e2c3386eea"} Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.753423 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" event={"ID":"3760f848-91f6-4fdf-9bd4-d6ffbac2de6d","Type":"ContainerStarted","Data":"a448610f0e4a31207f8dd67a1fce2e61e550d37df4dcdaa1fdd7dc2572480a55"} Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.753572 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.754696 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" event={"ID":"e07ee970-a07b-4a6b-b5b5-3387bd4b6da2","Type":"ContainerStarted","Data":"24753e7a0b900bd39299962abd9f83e443faa764a518499bd207f8c8d44a2a26"} Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.755003 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.756073 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" event={"ID":"6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715","Type":"ContainerStarted","Data":"301e15fe0d64d914a95eb23a936897263a461399068490323bead4f523b8d9e8"} Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.756244 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.766803 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" podStartSLOduration=3.184514218 podStartE2EDuration="29.766789005s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:49.092234807 +0000 UTC m=+1041.377460085" lastFinishedPulling="2026-03-12 15:05:15.674509594 +0000 UTC m=+1067.959734872" observedRunningTime="2026-03-12 15:05:16.764239813 +0000 UTC m=+1069.049465091" watchObservedRunningTime="2026-03-12 15:05:16.766789005 +0000 UTC m=+1069.052014273" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.785999 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" podStartSLOduration=3.035135888 podStartE2EDuration="29.78597507s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.924488745 +0000 UTC m=+1041.209714023" lastFinishedPulling="2026-03-12 15:05:15.675327927 +0000 UTC m=+1067.960553205" observedRunningTime="2026-03-12 15:05:16.784240561 +0000 UTC m=+1069.069465839" watchObservedRunningTime="2026-03-12 15:05:16.78597507 +0000 UTC m=+1069.071200368" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.805474 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zlbth" podStartSLOduration=3.192990158 podStartE2EDuration="29.805445652s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:49.09624755 +0000 UTC m=+1041.381472828" lastFinishedPulling="2026-03-12 15:05:15.708703044 +0000 UTC m=+1067.993928322" observedRunningTime="2026-03-12 15:05:16.800491682 +0000 UTC m=+1069.085716970" watchObservedRunningTime="2026-03-12 15:05:16.805445652 +0000 UTC m=+1069.090670970" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.835889 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" podStartSLOduration=22.483518979 podStartE2EDuration="29.835872086s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:05:08.366224917 +0000 UTC m=+1060.651450195" lastFinishedPulling="2026-03-12 15:05:15.718578024 +0000 UTC m=+1068.003803302" observedRunningTime="2026-03-12 15:05:16.83108007 +0000 UTC m=+1069.116305368" watchObservedRunningTime="2026-03-12 15:05:16.835872086 +0000 UTC m=+1069.121097374" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.851822 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" podStartSLOduration=3.269731227 podStartE2EDuration="29.851800018s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:49.092415382 +0000 UTC m=+1041.377640660" lastFinishedPulling="2026-03-12 15:05:15.674484183 +0000 UTC m=+1067.959709451" observedRunningTime="2026-03-12 15:05:16.843846652 +0000 UTC m=+1069.129071930" watchObservedRunningTime="2026-03-12 15:05:16.851800018 +0000 UTC m=+1069.137025296" Mar 12 15:05:16 crc kubenswrapper[4869]: I0312 15:05:16.864094 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" podStartSLOduration=3.116160278 podStartE2EDuration="29.864070056s" podCreationTimestamp="2026-03-12 15:04:47 +0000 UTC" firstStartedPulling="2026-03-12 15:04:48.934672565 +0000 UTC m=+1041.219897843" lastFinishedPulling="2026-03-12 15:05:15.682582343 +0000 UTC m=+1067.967807621" observedRunningTime="2026-03-12 15:05:16.856859602 +0000 UTC m=+1069.142084900" watchObservedRunningTime="2026-03-12 15:05:16.864070056 +0000 UTC m=+1069.149295334" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.162611 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-g9sqx" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.184047 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-hn64k" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.196486 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-2phvg" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.223668 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-bbfwc" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.283129 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-hxljg" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.325166 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-cxtpp" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.375551 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bgssp" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.401147 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-sggnn" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.423383 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-zsvnq" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.434417 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-h9gxw" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.547702 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-qhsnc" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.594498 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-7p294" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.603820 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-f9gh7" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.910841 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-96d7k" Mar 12 15:05:17 crc kubenswrapper[4869]: I0312 15:05:17.921853 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-s5ptd" Mar 12 15:05:18 crc kubenswrapper[4869]: I0312 15:05:18.972452 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:05:18 crc kubenswrapper[4869]: I0312 15:05:18.980224 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9089d1a-4d28-4973-a826-c7fa8b99acab-cert\") pod \"infra-operator-controller-manager-5995f4446f-vshpv\" (UID: \"c9089d1a-4d28-4973-a826-c7fa8b99acab\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:05:19 crc kubenswrapper[4869]: I0312 15:05:19.100756 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:05:19 crc kubenswrapper[4869]: I0312 15:05:19.602173 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv"] Mar 12 15:05:19 crc kubenswrapper[4869]: I0312 15:05:19.684173 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:05:19 crc kubenswrapper[4869]: I0312 15:05:19.684239 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:05:19 crc kubenswrapper[4869]: I0312 15:05:19.782080 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" event={"ID":"c9089d1a-4d28-4973-a826-c7fa8b99acab","Type":"ContainerStarted","Data":"938b73022c03e355602c39d1c57b70ab601aa763fbded3e98c4f3a823e7b67ac"} Mar 12 15:05:21 crc kubenswrapper[4869]: I0312 15:05:21.797109 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" event={"ID":"c9089d1a-4d28-4973-a826-c7fa8b99acab","Type":"ContainerStarted","Data":"e11909093b0e22f0506bd5a7779891060d79b13082e840d0e663b8904866b6ac"} Mar 12 15:05:21 crc kubenswrapper[4869]: I0312 15:05:21.797483 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:05:21 crc kubenswrapper[4869]: I0312 15:05:21.810784 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" podStartSLOduration=34.258374284 podStartE2EDuration="35.810761264s" podCreationTimestamp="2026-03-12 15:04:46 +0000 UTC" firstStartedPulling="2026-03-12 15:05:19.615419446 +0000 UTC m=+1071.900644724" lastFinishedPulling="2026-03-12 15:05:21.167806426 +0000 UTC m=+1073.453031704" observedRunningTime="2026-03-12 15:05:21.809957991 +0000 UTC m=+1074.095183269" watchObservedRunningTime="2026-03-12 15:05:21.810761264 +0000 UTC m=+1074.095986562" Mar 12 15:05:23 crc kubenswrapper[4869]: I0312 15:05:23.537669 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz" Mar 12 15:05:27 crc kubenswrapper[4869]: I0312 15:05:27.672236 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hthfl" Mar 12 15:05:27 crc kubenswrapper[4869]: I0312 15:05:27.677025 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-t2cd8" Mar 12 15:05:27 crc kubenswrapper[4869]: I0312 15:05:27.705080 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-fh776" Mar 12 15:05:27 crc kubenswrapper[4869]: I0312 15:05:27.804892 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-mhw7h" Mar 12 15:05:29 crc kubenswrapper[4869]: I0312 15:05:29.106215 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-vshpv" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.278945 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlhxl"] Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.281189 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.284380 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-n6bg4" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.284780 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.310877 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlhxl"] Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.312762 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kphqk\" (UniqueName: \"kubernetes.io/projected/b5017c25-f09d-4525-976e-40d3cb461614-kube-api-access-kphqk\") pod \"dnsmasq-dns-675f4bcbfc-qlhxl\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.312815 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5017c25-f09d-4525-976e-40d3cb461614-config\") pod \"dnsmasq-dns-675f4bcbfc-qlhxl\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.348000 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-92rhk"] Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.349463 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.352063 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.357893 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-92rhk"] Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.413796 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-config\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.413859 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kphqk\" (UniqueName: \"kubernetes.io/projected/b5017c25-f09d-4525-976e-40d3cb461614-kube-api-access-kphqk\") pod \"dnsmasq-dns-675f4bcbfc-qlhxl\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.413885 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.413922 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5017c25-f09d-4525-976e-40d3cb461614-config\") pod \"dnsmasq-dns-675f4bcbfc-qlhxl\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 
15:05:49.413954 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4prnq\" (UniqueName: \"kubernetes.io/projected/d6485c16-2820-4120-97f6-dccc8e1700ed-kube-api-access-4prnq\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.415084 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5017c25-f09d-4525-976e-40d3cb461614-config\") pod \"dnsmasq-dns-675f4bcbfc-qlhxl\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.439087 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kphqk\" (UniqueName: \"kubernetes.io/projected/b5017c25-f09d-4525-976e-40d3cb461614-kube-api-access-kphqk\") pod \"dnsmasq-dns-675f4bcbfc-qlhxl\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.515075 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.515170 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4prnq\" (UniqueName: \"kubernetes.io/projected/d6485c16-2820-4120-97f6-dccc8e1700ed-kube-api-access-4prnq\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.515237 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-config\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.516046 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.516124 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-config\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.532292 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4prnq\" (UniqueName: \"kubernetes.io/projected/d6485c16-2820-4120-97f6-dccc8e1700ed-kube-api-access-4prnq\") pod \"dnsmasq-dns-78dd6ddcc-92rhk\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.608076 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.665443 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.684129 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.684460 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.684514 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.685244 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0eefeacb8cbd3ff8e6abbc8c8ef619674d30f19c6c54cde868b340244e6c200"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.685311 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://e0eefeacb8cbd3ff8e6abbc8c8ef619674d30f19c6c54cde868b340244e6c200" gracePeriod=600 Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.996376 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="e0eefeacb8cbd3ff8e6abbc8c8ef619674d30f19c6c54cde868b340244e6c200" exitCode=0 Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.996415 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"e0eefeacb8cbd3ff8e6abbc8c8ef619674d30f19c6c54cde868b340244e6c200"} Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.996442 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"2f5452570a2d00afc7e7591fc67b3884055af23168ca0e1d9b4ff0e5dcdc6950"} Mar 12 15:05:49 crc kubenswrapper[4869]: I0312 15:05:49.996461 4869 scope.go:117] "RemoveContainer" containerID="f13d6e2c562cf7652d696fbd35b8d5dcd20c099639553782e920f330cc3ff75c" Mar 12 15:05:50 crc kubenswrapper[4869]: I0312 15:05:50.039619 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlhxl"] Mar 12 15:05:50 crc kubenswrapper[4869]: W0312 15:05:50.044071 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5017c25_f09d_4525_976e_40d3cb461614.slice/crio-c7f3a3c8e1fe1d2b8dd67e5d3e1482e3b66bc6ff50b7e89a24908b63f27c9ed1 WatchSource:0}: Error finding container c7f3a3c8e1fe1d2b8dd67e5d3e1482e3b66bc6ff50b7e89a24908b63f27c9ed1: Status 404 returned error can't find the container with id c7f3a3c8e1fe1d2b8dd67e5d3e1482e3b66bc6ff50b7e89a24908b63f27c9ed1 Mar 12 15:05:50 crc kubenswrapper[4869]: I0312 15:05:50.122567 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-92rhk"] Mar 12 15:05:51 crc kubenswrapper[4869]: I0312 15:05:51.004773 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" event={"ID":"b5017c25-f09d-4525-976e-40d3cb461614","Type":"ContainerStarted","Data":"c7f3a3c8e1fe1d2b8dd67e5d3e1482e3b66bc6ff50b7e89a24908b63f27c9ed1"} Mar 12 15:05:51 crc kubenswrapper[4869]: I0312 15:05:51.009384 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" event={"ID":"d6485c16-2820-4120-97f6-dccc8e1700ed","Type":"ContainerStarted","Data":"6075ef73f3dc8134b320b552a31b58961b93d76008304139f929a2bed68d4c34"} Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.107544 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlhxl"] Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.136449 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-67phk"] Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.139438 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.146883 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-67phk"] Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.260263 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.260366 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27lm9\" (UniqueName: \"kubernetes.io/projected/e05b23e2-5c7f-4421-9b32-b70957eea944-kube-api-access-27lm9\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" 
Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.260386 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-config\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.365364 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-config\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.365434 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27lm9\" (UniqueName: \"kubernetes.io/projected/e05b23e2-5c7f-4421-9b32-b70957eea944-kube-api-access-27lm9\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.365508 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.366454 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-config\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.366948 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.418005 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27lm9\" (UniqueName: \"kubernetes.io/projected/e05b23e2-5c7f-4421-9b32-b70957eea944-kube-api-access-27lm9\") pod \"dnsmasq-dns-5ccc8479f9-67phk\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.468218 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.525848 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-92rhk"] Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.548146 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-glxdl"] Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.549234 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.569904 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qx7k\" (UniqueName: \"kubernetes.io/projected/fb130117-6676-4748-89ff-b2e5f4a1c120-kube-api-access-6qx7k\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.569959 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.569993 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-config\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.573190 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-glxdl"] Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.672856 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qx7k\" (UniqueName: \"kubernetes.io/projected/fb130117-6676-4748-89ff-b2e5f4a1c120-kube-api-access-6qx7k\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.673268 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.673297 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-config\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.674357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-config\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.674381 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.702867 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qx7k\" (UniqueName: \"kubernetes.io/projected/fb130117-6676-4748-89ff-b2e5f4a1c120-kube-api-access-6qx7k\") pod \"dnsmasq-dns-57d769cc4f-glxdl\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") " pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:52 crc kubenswrapper[4869]: I0312 15:05:52.923256 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.004157 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-67phk"] Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.301259 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.305142 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.308120 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.308498 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xbp7k" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.310298 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.310726 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.310789 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.310753 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.310954 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.319350 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 
15:05:53.380931 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdbmn\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-kube-api-access-zdbmn\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.380972 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.380993 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.381045 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.381065 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc 
kubenswrapper[4869]: I0312 15:05:53.381168 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.381222 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.381350 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.381392 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0323899-ea3b-4572-baa4-3483b0d5fd86-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.381422 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.381465 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0323899-ea3b-4572-baa4-3483b0d5fd86-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482447 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0323899-ea3b-4572-baa4-3483b0d5fd86-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482520 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdbmn\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-kube-api-access-zdbmn\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482547 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482579 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482600 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482618 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482640 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482665 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482697 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482717 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/e0323899-ea3b-4572-baa4-3483b0d5fd86-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.482736 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.483162 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.483547 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.483589 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.483840 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.483900 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.485774 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.487430 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0323899-ea3b-4572-baa4-3483b0d5fd86-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.494985 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.495654 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: 
I0312 15:05:53.502215 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdbmn\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-kube-api-access-zdbmn\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.508222 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0323899-ea3b-4572-baa4-3483b0d5fd86-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.509642 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.636443 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.671628 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.673777 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.682248 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.682860 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.683208 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.682278 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2cvmj" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.683714 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.684807 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.688881 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.712344 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.790696 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.790741 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.790780 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.790797 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6x4h\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-kube-api-access-l6x4h\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.791021 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.791072 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e764959-1933-4a88-b8de-fd853d49a0d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.791175 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3e764959-1933-4a88-b8de-fd853d49a0d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.791202 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.791273 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.791311 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.791327 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.892843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.892890 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.892939 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.892965 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.893429 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.893426 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.893750 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.893849 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6x4h\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-kube-api-access-l6x4h\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.893926 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.893941 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e764959-1933-4a88-b8de-fd853d49a0d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.893994 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e764959-1933-4a88-b8de-fd853d49a0d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.894008 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.894058 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.894214 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.895097 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.895599 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.895936 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc 
kubenswrapper[4869]: I0312 15:05:53.898536 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.899597 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e764959-1933-4a88-b8de-fd853d49a0d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.899687 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.900792 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e764959-1933-4a88-b8de-fd853d49a0d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.928907 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6x4h\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-kube-api-access-l6x4h\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:53 crc kubenswrapper[4869]: I0312 15:05:53.938927 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " pod="openstack/rabbitmq-server-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.006619 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.682354 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.684147 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.689346 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-444rx" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.690813 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.690894 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.691547 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.694321 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.703149 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.833780 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " 
pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.833865 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406efc2-cefd-4e44-a5f0-7384101c9b36-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.833971 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4406efc2-cefd-4e44-a5f0-7384101c9b36-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.834021 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.834076 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4406efc2-cefd-4e44-a5f0-7384101c9b36-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.834116 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-kolla-config\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 
12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.834153 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-config-data-default\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.834207 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mzw7\" (UniqueName: \"kubernetes.io/projected/4406efc2-cefd-4e44-a5f0-7384101c9b36-kube-api-access-2mzw7\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935136 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-config-data-default\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935203 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mzw7\" (UniqueName: \"kubernetes.io/projected/4406efc2-cefd-4e44-a5f0-7384101c9b36-kube-api-access-2mzw7\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935245 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935278 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406efc2-cefd-4e44-a5f0-7384101c9b36-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935348 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4406efc2-cefd-4e44-a5f0-7384101c9b36-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935378 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935417 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4406efc2-cefd-4e44-a5f0-7384101c9b36-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.935445 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-kolla-config\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.936127 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-config-data-default\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.936132 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-kolla-config\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.936842 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.937092 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4406efc2-cefd-4e44-a5f0-7384101c9b36-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.937922 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4406efc2-cefd-4e44-a5f0-7384101c9b36-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.942884 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4406efc2-cefd-4e44-a5f0-7384101c9b36-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.954943 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4406efc2-cefd-4e44-a5f0-7384101c9b36-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.957160 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:54 crc kubenswrapper[4869]: I0312 15:05:54.958282 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mzw7\" (UniqueName: \"kubernetes.io/projected/4406efc2-cefd-4e44-a5f0-7384101c9b36-kube-api-access-2mzw7\") pod \"openstack-galera-0\" (UID: \"4406efc2-cefd-4e44-a5f0-7384101c9b36\") " pod="openstack/openstack-galera-0" Mar 12 15:05:55 crc kubenswrapper[4869]: I0312 15:05:55.011722 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.061463 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" event={"ID":"e05b23e2-5c7f-4421-9b32-b70957eea944","Type":"ContainerStarted","Data":"59f46c4d644a41e9395acc36430a4c995073c7d5b05d4985882a5d9b54630e2e"} Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.136218 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.137521 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.140970 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-zmxk2" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.141312 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.142815 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.142941 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.154411 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.257891 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.257940 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8b58aad-8641-4aec-8053-f4b75d5931e8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.258015 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.258067 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.258131 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b58aad-8641-4aec-8053-f4b75d5931e8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.258183 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8dm\" (UniqueName: \"kubernetes.io/projected/c8b58aad-8641-4aec-8053-f4b75d5931e8-kube-api-access-6n8dm\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.258239 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8b58aad-8641-4aec-8053-f4b75d5931e8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.258294 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.359863 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.360386 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8b58aad-8641-4aec-8053-f4b75d5931e8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.360426 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.360472 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.360520 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b58aad-8641-4aec-8053-f4b75d5931e8-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.360547 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8dm\" (UniqueName: \"kubernetes.io/projected/c8b58aad-8641-4aec-8053-f4b75d5931e8-kube-api-access-6n8dm\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.360602 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8b58aad-8641-4aec-8053-f4b75d5931e8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.360638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.362027 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.362959 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.364245 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.364497 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8b58aad-8641-4aec-8053-f4b75d5931e8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.364516 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8b58aad-8641-4aec-8053-f4b75d5931e8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.386797 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8dm\" (UniqueName: \"kubernetes.io/projected/c8b58aad-8641-4aec-8053-f4b75d5931e8-kube-api-access-6n8dm\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.386813 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b58aad-8641-4aec-8053-f4b75d5931e8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " 
pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.387097 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8b58aad-8641-4aec-8053-f4b75d5931e8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.398895 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c8b58aad-8641-4aec-8053-f4b75d5931e8\") " pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.471377 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.476967 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.480964 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.484867 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.486896 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.487281 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-lj5pg" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.497986 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.564019 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28b0b79e-10ab-436b-a34b-af51bd63d60a-config-data\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.564077 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/28b0b79e-10ab-436b-a34b-af51bd63d60a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.564102 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b0b79e-10ab-436b-a34b-af51bd63d60a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.564168 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/28b0b79e-10ab-436b-a34b-af51bd63d60a-kolla-config\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.564200 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbt8\" (UniqueName: \"kubernetes.io/projected/28b0b79e-10ab-436b-a34b-af51bd63d60a-kube-api-access-gtbt8\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.665355 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/28b0b79e-10ab-436b-a34b-af51bd63d60a-kolla-config\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.665411 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbt8\" (UniqueName: \"kubernetes.io/projected/28b0b79e-10ab-436b-a34b-af51bd63d60a-kube-api-access-gtbt8\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.665443 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28b0b79e-10ab-436b-a34b-af51bd63d60a-config-data\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.665468 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/28b0b79e-10ab-436b-a34b-af51bd63d60a-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.665488 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b0b79e-10ab-436b-a34b-af51bd63d60a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.666838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28b0b79e-10ab-436b-a34b-af51bd63d60a-config-data\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.667390 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/28b0b79e-10ab-436b-a34b-af51bd63d60a-kolla-config\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.668896 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b0b79e-10ab-436b-a34b-af51bd63d60a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.672514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/28b0b79e-10ab-436b-a34b-af51bd63d60a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.691507 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtbt8\" (UniqueName: 
\"kubernetes.io/projected/28b0b79e-10ab-436b-a34b-af51bd63d60a-kube-api-access-gtbt8\") pod \"memcached-0\" (UID: \"28b0b79e-10ab-436b-a34b-af51bd63d60a\") " pod="openstack/memcached-0" Mar 12 15:05:56 crc kubenswrapper[4869]: I0312 15:05:56.811130 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 12 15:05:58 crc kubenswrapper[4869]: I0312 15:05:58.732591 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:05:58 crc kubenswrapper[4869]: I0312 15:05:58.734063 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 15:05:58 crc kubenswrapper[4869]: I0312 15:05:58.736976 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wnmbv" Mar 12 15:05:58 crc kubenswrapper[4869]: I0312 15:05:58.740214 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:05:58 crc kubenswrapper[4869]: I0312 15:05:58.897619 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bxxn\" (UniqueName: \"kubernetes.io/projected/5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3-kube-api-access-8bxxn\") pod \"kube-state-metrics-0\" (UID: \"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3\") " pod="openstack/kube-state-metrics-0" Mar 12 15:05:58 crc kubenswrapper[4869]: I0312 15:05:58.999516 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bxxn\" (UniqueName: \"kubernetes.io/projected/5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3-kube-api-access-8bxxn\") pod \"kube-state-metrics-0\" (UID: \"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3\") " pod="openstack/kube-state-metrics-0" Mar 12 15:05:59 crc kubenswrapper[4869]: I0312 15:05:59.016374 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bxxn\" (UniqueName: 
\"kubernetes.io/projected/5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3-kube-api-access-8bxxn\") pod \"kube-state-metrics-0\" (UID: \"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3\") " pod="openstack/kube-state-metrics-0" Mar 12 15:05:59 crc kubenswrapper[4869]: I0312 15:05:59.110471 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.136925 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555466-j7fkr"] Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.138030 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-j7fkr" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.142037 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.142543 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.143692 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-j7fkr"] Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.143867 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.216784 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvxq\" (UniqueName: \"kubernetes.io/projected/1b691259-763f-4535-98af-ef6fa62d8a0d-kube-api-access-sqvxq\") pod \"auto-csr-approver-29555466-j7fkr\" (UID: \"1b691259-763f-4535-98af-ef6fa62d8a0d\") " pod="openshift-infra/auto-csr-approver-29555466-j7fkr" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.318962 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sqvxq\" (UniqueName: \"kubernetes.io/projected/1b691259-763f-4535-98af-ef6fa62d8a0d-kube-api-access-sqvxq\") pod \"auto-csr-approver-29555466-j7fkr\" (UID: \"1b691259-763f-4535-98af-ef6fa62d8a0d\") " pod="openshift-infra/auto-csr-approver-29555466-j7fkr" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.349769 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvxq\" (UniqueName: \"kubernetes.io/projected/1b691259-763f-4535-98af-ef6fa62d8a0d-kube-api-access-sqvxq\") pod \"auto-csr-approver-29555466-j7fkr\" (UID: \"1b691259-763f-4535-98af-ef6fa62d8a0d\") " pod="openshift-infra/auto-csr-approver-29555466-j7fkr" Mar 12 15:06:00 crc kubenswrapper[4869]: I0312 15:06:00.468654 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-j7fkr" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.652138 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wzzxq"] Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.653054 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.655394 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hs4mw" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.655481 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.655526 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.671135 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wzzxq"] Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.698331 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jvmhv"] Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.699782 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.712204 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jvmhv"] Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.741195 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t6rt\" (UniqueName: \"kubernetes.io/projected/ad973222-a042-43af-9c00-b0f6d795c7d1-kube-api-access-4t6rt\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.741452 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-run\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.741747 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad973222-a042-43af-9c00-b0f6d795c7d1-ovn-controller-tls-certs\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.741853 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-log-ovn\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.741885 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ad973222-a042-43af-9c00-b0f6d795c7d1-combined-ca-bundle\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.741935 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-run-ovn\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.742169 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad973222-a042-43af-9c00-b0f6d795c7d1-scripts\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843440 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-log\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843492 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad973222-a042-43af-9c00-b0f6d795c7d1-ovn-controller-tls-certs\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843511 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pxwq\" (UniqueName: 
\"kubernetes.io/projected/5b0ed011-5259-4904-82e5-320adf5ff1cf-kube-api-access-4pxwq\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843713 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-log-ovn\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843755 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad973222-a042-43af-9c00-b0f6d795c7d1-combined-ca-bundle\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843804 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-run-ovn\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843830 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-lib\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843890 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad973222-a042-43af-9c00-b0f6d795c7d1-scripts\") pod \"ovn-controller-wzzxq\" (UID: 
\"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843959 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t6rt\" (UniqueName: \"kubernetes.io/projected/ad973222-a042-43af-9c00-b0f6d795c7d1-kube-api-access-4t6rt\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.843991 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b0ed011-5259-4904-82e5-320adf5ff1cf-scripts\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.844035 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-run\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.844071 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-etc-ovs\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.844098 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-run\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 
15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.844653 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-log-ovn\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.844665 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-run-ovn\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.844753 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ad973222-a042-43af-9c00-b0f6d795c7d1-var-run\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.846266 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad973222-a042-43af-9c00-b0f6d795c7d1-scripts\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.848194 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad973222-a042-43af-9c00-b0f6d795c7d1-ovn-controller-tls-certs\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.850938 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad973222-a042-43af-9c00-b0f6d795c7d1-combined-ca-bundle\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.860732 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t6rt\" (UniqueName: \"kubernetes.io/projected/ad973222-a042-43af-9c00-b0f6d795c7d1-kube-api-access-4t6rt\") pod \"ovn-controller-wzzxq\" (UID: \"ad973222-a042-43af-9c00-b0f6d795c7d1\") " pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.949918 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-lib\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.950133 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b0ed011-5259-4904-82e5-320adf5ff1cf-scripts\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.950171 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-etc-ovs\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.950191 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-run\") pod \"ovn-controller-ovs-jvmhv\" (UID: 
\"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.950215 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-log\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.950238 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pxwq\" (UniqueName: \"kubernetes.io/projected/5b0ed011-5259-4904-82e5-320adf5ff1cf-kube-api-access-4pxwq\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.950669 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-lib\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.950829 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-run\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.951160 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-etc-ovs\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.951180 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5b0ed011-5259-4904-82e5-320adf5ff1cf-var-log\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.958887 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b0ed011-5259-4904-82e5-320adf5ff1cf-scripts\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.966693 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:01 crc kubenswrapper[4869]: I0312 15:06:01.973404 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pxwq\" (UniqueName: \"kubernetes.io/projected/5b0ed011-5259-4904-82e5-320adf5ff1cf-kube-api-access-4pxwq\") pod \"ovn-controller-ovs-jvmhv\" (UID: \"5b0ed011-5259-4904-82e5-320adf5ff1cf\") " pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.014324 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.371603 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.373423 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.373520 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.413592 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9wsmv" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.413619 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.413773 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.413922 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.414026 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459331 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5bf47c8a-507e-4eba-9776-b516e4555df4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459371 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459504 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459570 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf47c8a-507e-4eba-9776-b516e4555df4-config\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459782 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bf47c8a-507e-4eba-9776-b516e4555df4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459892 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbh95\" (UniqueName: \"kubernetes.io/projected/5bf47c8a-507e-4eba-9776-b516e4555df4-kube-api-access-hbh95\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459925 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.459982 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " 
pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.561997 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf47c8a-507e-4eba-9776-b516e4555df4-config\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.562115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bf47c8a-507e-4eba-9776-b516e4555df4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.562180 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbh95\" (UniqueName: \"kubernetes.io/projected/5bf47c8a-507e-4eba-9776-b516e4555df4-kube-api-access-hbh95\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.562199 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.562251 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.562284 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/5bf47c8a-507e-4eba-9776-b516e4555df4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.562324 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.562397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.563259 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.564629 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf47c8a-507e-4eba-9776-b516e4555df4-config\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.564873 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bf47c8a-507e-4eba-9776-b516e4555df4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " 
pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.565191 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5bf47c8a-507e-4eba-9776-b516e4555df4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.566389 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.567668 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.582357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf47c8a-507e-4eba-9776-b516e4555df4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.586691 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbh95\" (UniqueName: \"kubernetes.io/projected/5bf47c8a-507e-4eba-9776-b516e4555df4-kube-api-access-hbh95\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.616703 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5bf47c8a-507e-4eba-9776-b516e4555df4\") " pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:02 crc kubenswrapper[4869]: I0312 15:06:02.736219 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:04 crc kubenswrapper[4869]: I0312 15:06:04.438059 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:06:05 crc kubenswrapper[4869]: E0312 15:06:05.335518 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 12 15:06:05 crc kubenswrapper[4869]: E0312 15:06:05.335931 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kphqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-qlhxl_openstack(b5017c25-f09d-4525-976e-40d3cb461614): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:06:05 crc kubenswrapper[4869]: E0312 15:06:05.337779 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" podUID="b5017c25-f09d-4525-976e-40d3cb461614" Mar 12 15:06:05 crc kubenswrapper[4869]: W0312 15:06:05.358094 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0323899_ea3b_4572_baa4_3483b0d5fd86.slice/crio-e8b3d6437b78bc78380602884eef512b1bdb0c6a9ad5b0e001d6087cbc220f1e WatchSource:0}: Error finding container e8b3d6437b78bc78380602884eef512b1bdb0c6a9ad5b0e001d6087cbc220f1e: Status 404 returned error can't find the container with id e8b3d6437b78bc78380602884eef512b1bdb0c6a9ad5b0e001d6087cbc220f1e Mar 12 15:06:05 crc kubenswrapper[4869]: E0312 15:06:05.403604 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 12 15:06:05 crc kubenswrapper[4869]: E0312 15:06:05.403793 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4prnq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-92rhk_openstack(d6485c16-2820-4120-97f6-dccc8e1700ed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:06:05 crc kubenswrapper[4869]: E0312 15:06:05.405685 4869 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" podUID="d6485c16-2820-4120-97f6-dccc8e1700ed" Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.850686 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 15:06:05 crc kubenswrapper[4869]: W0312 15:06:05.872146 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b58aad_8641_4aec_8053_f4b75d5931e8.slice/crio-13727bf042fdefd7053d8fec1a06c8abcc42a4fc5cb000ecca006d2b23886639 WatchSource:0}: Error finding container 13727bf042fdefd7053d8fec1a06c8abcc42a4fc5cb000ecca006d2b23886639: Status 404 returned error can't find the container with id 13727bf042fdefd7053d8fec1a06c8abcc42a4fc5cb000ecca006d2b23886639 Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.887083 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.888255 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.890669 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.891335 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.891954 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.904138 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 15:06:05 crc kubenswrapper[4869]: I0312 15:06:05.904756 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6pgs5" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.024829 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0454dc15-f63d-475c-9640-71a8d60d9e56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.024903 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.024953 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.024994 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0454dc15-f63d-475c-9640-71a8d60d9e56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.025076 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0454dc15-f63d-475c-9640-71a8d60d9e56-config\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.025156 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.025244 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.025277 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5pm6\" (UniqueName: \"kubernetes.io/projected/0454dc15-f63d-475c-9640-71a8d60d9e56-kube-api-access-q5pm6\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " 
pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127029 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127071 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5pm6\" (UniqueName: \"kubernetes.io/projected/0454dc15-f63d-475c-9640-71a8d60d9e56-kube-api-access-q5pm6\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127108 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0454dc15-f63d-475c-9640-71a8d60d9e56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127128 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127153 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127176 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0454dc15-f63d-475c-9640-71a8d60d9e56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127206 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0454dc15-f63d-475c-9640-71a8d60d9e56-config\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.127242 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.128677 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.129126 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0454dc15-f63d-475c-9640-71a8d60d9e56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.129883 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0454dc15-f63d-475c-9640-71a8d60d9e56-config\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " 
pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.130098 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0454dc15-f63d-475c-9640-71a8d60d9e56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.134415 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.134651 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.143969 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0454dc15-f63d-475c-9640-71a8d60d9e56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.144509 4869 generic.go:334] "Generic (PLEG): container finished" podID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerID="130ba41f8efb128bfc76cbe6082fa320e002250ccbcf11d7536afb9a683ada25" exitCode=0 Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.144583 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" 
event={"ID":"e05b23e2-5c7f-4421-9b32-b70957eea944","Type":"ContainerDied","Data":"130ba41f8efb128bfc76cbe6082fa320e002250ccbcf11d7536afb9a683ada25"} Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.146731 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5pm6\" (UniqueName: \"kubernetes.io/projected/0454dc15-f63d-475c-9640-71a8d60d9e56-kube-api-access-q5pm6\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.148168 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8b58aad-8641-4aec-8053-f4b75d5931e8","Type":"ContainerStarted","Data":"13727bf042fdefd7053d8fec1a06c8abcc42a4fc5cb000ecca006d2b23886639"} Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.149654 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0323899-ea3b-4572-baa4-3483b0d5fd86","Type":"ContainerStarted","Data":"e8b3d6437b78bc78380602884eef512b1bdb0c6a9ad5b0e001d6087cbc220f1e"} Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.156203 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0454dc15-f63d-475c-9640-71a8d60d9e56\") " pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.213871 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.282336 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-j7fkr"] Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.295520 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.300743 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:06:06 crc kubenswrapper[4869]: W0312 15:06:06.313508 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b691259_763f_4535_98af_ef6fa62d8a0d.slice/crio-ad07d33358850ac7a5556a388fbb5e84e3521d719934fc83b215f20219273fa6 WatchSource:0}: Error finding container ad07d33358850ac7a5556a388fbb5e84e3521d719934fc83b215f20219273fa6: Status 404 returned error can't find the container with id ad07d33358850ac7a5556a388fbb5e84e3521d719934fc83b215f20219273fa6 Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.328611 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.336199 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-glxdl"] Mar 12 15:06:06 crc kubenswrapper[4869]: W0312 15:06:06.346303 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e764959_1933_4a88_b8de_fd853d49a0d3.slice/crio-5f1fd2094bef98383942a91561a5cbbf3befcfbbb8d294e9a33ae0b2e565e142 WatchSource:0}: Error finding container 5f1fd2094bef98383942a91561a5cbbf3befcfbbb8d294e9a33ae0b2e565e142: Status 404 returned error can't find the container with id 5f1fd2094bef98383942a91561a5cbbf3befcfbbb8d294e9a33ae0b2e565e142 Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 
15:06:06.375098 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wzzxq"] Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.375375 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.502741 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 15:06:06 crc kubenswrapper[4869]: E0312 15:06:06.515745 4869 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 12 15:06:06 crc kubenswrapper[4869]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e05b23e2-5c7f-4421-9b32-b70957eea944/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 12 15:06:06 crc kubenswrapper[4869]: > podSandboxID="59f46c4d644a41e9395acc36430a4c995073c7d5b05d4985882a5d9b54630e2e" Mar 12 15:06:06 crc kubenswrapper[4869]: E0312 15:06:06.515897 4869 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 15:06:06 crc kubenswrapper[4869]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27lm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-67phk_openstack(e05b23e2-5c7f-4421-9b32-b70957eea944): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e05b23e2-5c7f-4421-9b32-b70957eea944/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 12 15:06:06 crc kubenswrapper[4869]: > logger="UnhandledError" Mar 12 15:06:06 crc kubenswrapper[4869]: E0312 15:06:06.517073 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e05b23e2-5c7f-4421-9b32-b70957eea944/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.582444 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jvmhv"] Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.663312 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.684201 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.750387 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kphqk\" (UniqueName: \"kubernetes.io/projected/b5017c25-f09d-4525-976e-40d3cb461614-kube-api-access-kphqk\") pod \"b5017c25-f09d-4525-976e-40d3cb461614\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.750890 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5017c25-f09d-4525-976e-40d3cb461614-config\") pod \"b5017c25-f09d-4525-976e-40d3cb461614\" (UID: \"b5017c25-f09d-4525-976e-40d3cb461614\") " Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.751011 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4prnq\" (UniqueName: \"kubernetes.io/projected/d6485c16-2820-4120-97f6-dccc8e1700ed-kube-api-access-4prnq\") pod \"d6485c16-2820-4120-97f6-dccc8e1700ed\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.751192 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-dns-svc\") pod \"d6485c16-2820-4120-97f6-dccc8e1700ed\" (UID: \"d6485c16-2820-4120-97f6-dccc8e1700ed\") " Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.751420 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-config\") pod \"d6485c16-2820-4120-97f6-dccc8e1700ed\" (UID: 
\"d6485c16-2820-4120-97f6-dccc8e1700ed\") " Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.751620 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5017c25-f09d-4525-976e-40d3cb461614-config" (OuterVolumeSpecName: "config") pod "b5017c25-f09d-4525-976e-40d3cb461614" (UID: "b5017c25-f09d-4525-976e-40d3cb461614"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.751883 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6485c16-2820-4120-97f6-dccc8e1700ed" (UID: "d6485c16-2820-4120-97f6-dccc8e1700ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.752167 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-config" (OuterVolumeSpecName: "config") pod "d6485c16-2820-4120-97f6-dccc8e1700ed" (UID: "d6485c16-2820-4120-97f6-dccc8e1700ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.755030 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5017c25-f09d-4525-976e-40d3cb461614-config\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.755181 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.755315 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6485c16-2820-4120-97f6-dccc8e1700ed-config\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.756100 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6485c16-2820-4120-97f6-dccc8e1700ed-kube-api-access-4prnq" (OuterVolumeSpecName: "kube-api-access-4prnq") pod "d6485c16-2820-4120-97f6-dccc8e1700ed" (UID: "d6485c16-2820-4120-97f6-dccc8e1700ed"). InnerVolumeSpecName "kube-api-access-4prnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.757345 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5017c25-f09d-4525-976e-40d3cb461614-kube-api-access-kphqk" (OuterVolumeSpecName: "kube-api-access-kphqk") pod "b5017c25-f09d-4525-976e-40d3cb461614" (UID: "b5017c25-f09d-4525-976e-40d3cb461614"). InnerVolumeSpecName "kube-api-access-kphqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.857794 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kphqk\" (UniqueName: \"kubernetes.io/projected/b5017c25-f09d-4525-976e-40d3cb461614-kube-api-access-kphqk\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.857822 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4prnq\" (UniqueName: \"kubernetes.io/projected/d6485c16-2820-4120-97f6-dccc8e1700ed-kube-api-access-4prnq\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:06 crc kubenswrapper[4869]: I0312 15:06:06.902816 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.035417 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hrwrb"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.043117 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.047864 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.057080 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hrwrb"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.161302 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-ovn-rundir\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.161384 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-config\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.161436 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbb7w\" (UniqueName: \"kubernetes.io/projected/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-kube-api-access-hbb7w\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.161476 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-combined-ca-bundle\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.161512 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-ovs-rundir\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.161570 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.174519 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk" event={"ID":"d6485c16-2820-4120-97f6-dccc8e1700ed","Type":"ContainerDied","Data":"6075ef73f3dc8134b320b552a31b58961b93d76008304139f929a2bed68d4c34"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.174633 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-92rhk"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.176658 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jvmhv" event={"ID":"5b0ed011-5259-4904-82e5-320adf5ff1cf","Type":"ContainerStarted","Data":"11e1c6cd018eae9668c908935669d03672f10d8d356f9cecf046bb3d1615c872"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.177926 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5bf47c8a-507e-4eba-9776-b516e4555df4","Type":"ContainerStarted","Data":"dd68df9c9d8fcbd2da3ca40db99399d8ad550a4c5c516fe2796c29952b71f324"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.178944 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wzzxq" event={"ID":"ad973222-a042-43af-9c00-b0f6d795c7d1","Type":"ContainerStarted","Data":"044fd73770589f066e27dd970f11200ffe3c916a5281f737cd9818479751c7f7"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.179722 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e764959-1933-4a88-b8de-fd853d49a0d3","Type":"ContainerStarted","Data":"5f1fd2094bef98383942a91561a5cbbf3befcfbbb8d294e9a33ae0b2e565e142"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.183421 4869 generic.go:334] "Generic (PLEG): container finished" podID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerID="6f885d0200171ca3a39e2c2bd3be89ffdd40ff3591d713663971b4822d8574aa" exitCode=0
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.183453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" event={"ID":"fb130117-6676-4748-89ff-b2e5f4a1c120","Type":"ContainerDied","Data":"6f885d0200171ca3a39e2c2bd3be89ffdd40ff3591d713663971b4822d8574aa"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.183480 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" event={"ID":"fb130117-6676-4748-89ff-b2e5f4a1c120","Type":"ContainerStarted","Data":"5b324ba054356ada731bc079d048d06526bcb4d89c82ed39ea738cd43310b57d"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.184821 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"28b0b79e-10ab-436b-a34b-af51bd63d60a","Type":"ContainerStarted","Data":"3c1590c075177615cc2331d5667a0a8fe41a6a552b8c2c34d76d2fb064623d95"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.187694 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555466-j7fkr" event={"ID":"1b691259-763f-4535-98af-ef6fa62d8a0d","Type":"ContainerStarted","Data":"ad07d33358850ac7a5556a388fbb5e84e3521d719934fc83b215f20219273fa6"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.192308 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl" event={"ID":"b5017c25-f09d-4525-976e-40d3cb461614","Type":"ContainerDied","Data":"c7f3a3c8e1fe1d2b8dd67e5d3e1482e3b66bc6ff50b7e89a24908b63f27c9ed1"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.192391 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-qlhxl"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.213077 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3","Type":"ContainerStarted","Data":"9bc697c96f2c504a60ace19a99b1c769b5b869a718d5b7f73938d03e5925e355"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.262678 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4406efc2-cefd-4e44-a5f0-7384101c9b36","Type":"ContainerStarted","Data":"d313cf08717ffb990d02af2ddfe3b68df1edfc23e13e90fa46c7e81d1c4315ed"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.266690 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-config\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.269012 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbb7w\" (UniqueName: \"kubernetes.io/projected/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-kube-api-access-hbb7w\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.269160 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-combined-ca-bundle\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.269535 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-ovs-rundir\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.269670 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.269718 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-ovn-rundir\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.267459 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-config\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.270494 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-ovs-rundir\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.270644 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-ovn-rundir\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.287892 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.287974 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0454dc15-f63d-475c-9640-71a8d60d9e56","Type":"ContainerStarted","Data":"de53d37400738abb52ae1acb6bb2655c7e6e530cdaf061369538fd5ff82673d0"}
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.288029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-combined-ca-bundle\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.357261 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbb7w\" (UniqueName: \"kubernetes.io/projected/6c89cdd2-7ac2-46d2-be58-77f8b79acd81-kube-api-access-hbb7w\") pod \"ovn-controller-metrics-hrwrb\" (UID: \"6c89cdd2-7ac2-46d2-be58-77f8b79acd81\") " pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.379813 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hrwrb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.437687 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-glxdl"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.551796 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xlmvz"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.553346 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlhxl"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.553430 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.556145 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.560941 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-qlhxl"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.573935 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xlmvz"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.587104 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-92rhk"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.589991 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-92rhk"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.603858 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-67phk"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.606004 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp8hf\" (UniqueName: \"kubernetes.io/projected/38272c93-d688-45cc-8dd5-1f5ae4549216-kube-api-access-bp8hf\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.606055 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.606108 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-config\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.606160 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.623328 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-km7wm"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.633185 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.639459 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.661075 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-km7wm"]
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707695 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-config\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707736 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-config\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707772 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p86k7\" (UniqueName: \"kubernetes.io/projected/4d5a854f-c386-4811-bc85-90fa35c9915a-kube-api-access-p86k7\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707813 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707829 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707901 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp8hf\" (UniqueName: \"kubernetes.io/projected/38272c93-d688-45cc-8dd5-1f5ae4549216-kube-api-access-bp8hf\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707920 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707940 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.707958 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.708997 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-config\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.709097 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.709530 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.733374 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp8hf\" (UniqueName: \"kubernetes.io/projected/38272c93-d688-45cc-8dd5-1f5ae4549216-kube-api-access-bp8hf\") pod \"dnsmasq-dns-7fd796d7df-xlmvz\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.809008 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.809059 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.809088 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-config\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.809121 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p86k7\" (UniqueName: \"kubernetes.io/projected/4d5a854f-c386-4811-bc85-90fa35c9915a-kube-api-access-p86k7\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.809171 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.810009 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.810601 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-config\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.811716 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.813141 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.830289 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p86k7\" (UniqueName: \"kubernetes.io/projected/4d5a854f-c386-4811-bc85-90fa35c9915a-kube-api-access-p86k7\") pod \"dnsmasq-dns-86db49b7ff-km7wm\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.874599 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz"
Mar 12 15:06:07 crc kubenswrapper[4869]: I0312 15:06:07.984399 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm"
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.079679 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hrwrb"]
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.311968 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.315776 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hrwrb" event={"ID":"6c89cdd2-7ac2-46d2-be58-77f8b79acd81","Type":"ContainerStarted","Data":"531f8880929e5b2ebfc78f2d0d3c6a4bba7ebf1141ba1f6d8c8254f2459d5467"}
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.319529 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" event={"ID":"fb130117-6676-4748-89ff-b2e5f4a1c120","Type":"ContainerStarted","Data":"3a90f46cf781bb327ec69ae8b87f2bc9735202d54b0ab621c3815cc563f27eb7"}
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.319720 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerName="dnsmasq-dns" containerID="cri-o://3a90f46cf781bb327ec69ae8b87f2bc9735202d54b0ab621c3815cc563f27eb7" gracePeriod=10
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.319978 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl"
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.349210 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" podStartSLOduration=16.349196405 podStartE2EDuration="16.349196405s" podCreationTimestamp="2026-03-12 15:05:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:06:08.341897266 +0000 UTC m=+1120.627122544" watchObservedRunningTime="2026-03-12 15:06:08.349196405 +0000 UTC m=+1120.634421683"
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.362015 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5017c25-f09d-4525-976e-40d3cb461614" path="/var/lib/kubelet/pods/b5017c25-f09d-4525-976e-40d3cb461614/volumes"
Mar 12 15:06:08 crc kubenswrapper[4869]: I0312 15:06:08.362445 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6485c16-2820-4120-97f6-dccc8e1700ed" path="/var/lib/kubelet/pods/d6485c16-2820-4120-97f6-dccc8e1700ed/volumes"
Mar 12 15:06:09 crc kubenswrapper[4869]: I0312 15:06:09.346155 4869 generic.go:334] "Generic (PLEG): container finished" podID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerID="3a90f46cf781bb327ec69ae8b87f2bc9735202d54b0ab621c3815cc563f27eb7" exitCode=0
Mar 12 15:06:09 crc kubenswrapper[4869]: I0312 15:06:09.346254 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" event={"ID":"fb130117-6676-4748-89ff-b2e5f4a1c120","Type":"ContainerDied","Data":"3a90f46cf781bb327ec69ae8b87f2bc9735202d54b0ab621c3815cc563f27eb7"}
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.582948 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl"
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.715014 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qx7k\" (UniqueName: \"kubernetes.io/projected/fb130117-6676-4748-89ff-b2e5f4a1c120-kube-api-access-6qx7k\") pod \"fb130117-6676-4748-89ff-b2e5f4a1c120\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") "
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.715525 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-config\") pod \"fb130117-6676-4748-89ff-b2e5f4a1c120\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") "
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.715640 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-dns-svc\") pod \"fb130117-6676-4748-89ff-b2e5f4a1c120\" (UID: \"fb130117-6676-4748-89ff-b2e5f4a1c120\") "
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.721007 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb130117-6676-4748-89ff-b2e5f4a1c120-kube-api-access-6qx7k" (OuterVolumeSpecName: "kube-api-access-6qx7k") pod "fb130117-6676-4748-89ff-b2e5f4a1c120" (UID: "fb130117-6676-4748-89ff-b2e5f4a1c120"). InnerVolumeSpecName "kube-api-access-6qx7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.757395 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-config" (OuterVolumeSpecName: "config") pod "fb130117-6676-4748-89ff-b2e5f4a1c120" (UID: "fb130117-6676-4748-89ff-b2e5f4a1c120"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.772431 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb130117-6676-4748-89ff-b2e5f4a1c120" (UID: "fb130117-6676-4748-89ff-b2e5f4a1c120"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.817874 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qx7k\" (UniqueName: \"kubernetes.io/projected/fb130117-6676-4748-89ff-b2e5f4a1c120-kube-api-access-6qx7k\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.817914 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-config\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.817925 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb130117-6676-4748-89ff-b2e5f4a1c120-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 15:06:13 crc kubenswrapper[4869]: I0312 15:06:13.940656 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-km7wm"]
Mar 12 15:06:14 crc kubenswrapper[4869]: I0312 15:06:14.387850 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" event={"ID":"fb130117-6676-4748-89ff-b2e5f4a1c120","Type":"ContainerDied","Data":"5b324ba054356ada731bc079d048d06526bcb4d89c82ed39ea738cd43310b57d"}
Mar 12 15:06:14 crc kubenswrapper[4869]: I0312 15:06:14.387905 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl"
Mar 12 15:06:14 crc kubenswrapper[4869]: I0312 15:06:14.388152 4869 scope.go:117] "RemoveContainer" containerID="3a90f46cf781bb327ec69ae8b87f2bc9735202d54b0ab621c3815cc563f27eb7"
Mar 12 15:06:14 crc kubenswrapper[4869]: I0312 15:06:14.406771 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-glxdl"]
Mar 12 15:06:14 crc kubenswrapper[4869]: I0312 15:06:14.412324 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-glxdl"]
Mar 12 15:06:16 crc kubenswrapper[4869]: I0312 15:06:16.014948 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xlmvz"]
Mar 12 15:06:16 crc kubenswrapper[4869]: I0312 15:06:16.346051 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" path="/var/lib/kubelet/pods/fb130117-6676-4748-89ff-b2e5f4a1c120/volumes"
Mar 12 15:06:17 crc kubenswrapper[4869]: I0312 15:06:17.924104 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-glxdl" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: i/o timeout"
Mar 12 15:06:18 crc kubenswrapper[4869]: I0312 15:06:18.415557 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" event={"ID":"e05b23e2-5c7f-4421-9b32-b70957eea944","Type":"ContainerStarted","Data":"a8ee6ba6483402e6643252f397b2af68355b922a12755fa4a11606e941663f25"}
Mar 12 15:06:18 crc kubenswrapper[4869]: I0312 15:06:18.415712 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerName="dnsmasq-dns" containerID="cri-o://a8ee6ba6483402e6643252f397b2af68355b922a12755fa4a11606e941663f25" gracePeriod=10
Mar 12 15:06:18 crc kubenswrapper[4869]: I0312 15:06:18.415850 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk"
Mar 12 15:06:18 crc kubenswrapper[4869]: I0312 15:06:18.439086 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" podStartSLOduration=16.497221208 podStartE2EDuration="26.439066164s" podCreationTimestamp="2026-03-12 15:05:52 +0000 UTC" firstStartedPulling="2026-03-12 15:05:55.591515416 +0000 UTC m=+1107.876740684" lastFinishedPulling="2026-03-12 15:06:05.533360362 +0000 UTC m=+1117.818585640" observedRunningTime="2026-03-12 15:06:18.433906856 +0000 UTC m=+1130.719132144" watchObservedRunningTime="2026-03-12 15:06:18.439066164 +0000 UTC m=+1130.724291452"
Mar 12 15:06:18 crc kubenswrapper[4869]: W0312 15:06:18.881674 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38272c93_d688_45cc_8dd5_1f5ae4549216.slice/crio-f083100778d1a002c7f22e093b0aafd249a2e426a3022a6e426ca02eb885662c WatchSource:0}: Error finding container f083100778d1a002c7f22e093b0aafd249a2e426a3022a6e426ca02eb885662c: Status 404 returned error can't find the container with id f083100778d1a002c7f22e093b0aafd249a2e426a3022a6e426ca02eb885662c
Mar 12 15:06:18 crc kubenswrapper[4869]: I0312 15:06:18.935323 4869 scope.go:117] "RemoveContainer" containerID="6f885d0200171ca3a39e2c2bd3be89ffdd40ff3591d713663971b4822d8574aa"
Mar 12 15:06:19 crc kubenswrapper[4869]: I0312 15:06:19.424345 4869 generic.go:334] "Generic (PLEG): container finished" podID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerID="a8ee6ba6483402e6643252f397b2af68355b922a12755fa4a11606e941663f25" exitCode=0
Mar 12 15:06:19 crc kubenswrapper[4869]: I0312 15:06:19.424419 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk"
event={"ID":"e05b23e2-5c7f-4421-9b32-b70957eea944","Type":"ContainerDied","Data":"a8ee6ba6483402e6643252f397b2af68355b922a12755fa4a11606e941663f25"} Mar 12 15:06:19 crc kubenswrapper[4869]: I0312 15:06:19.427131 4869 generic.go:334] "Generic (PLEG): container finished" podID="1b691259-763f-4535-98af-ef6fa62d8a0d" containerID="6b711caada6fcc2094c40c2ad7cddf69eeefb7243782e53a79bc55dfcb83f8c6" exitCode=0 Mar 12 15:06:19 crc kubenswrapper[4869]: I0312 15:06:19.427179 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555466-j7fkr" event={"ID":"1b691259-763f-4535-98af-ef6fa62d8a0d","Type":"ContainerDied","Data":"6b711caada6fcc2094c40c2ad7cddf69eeefb7243782e53a79bc55dfcb83f8c6"} Mar 12 15:06:19 crc kubenswrapper[4869]: I0312 15:06:19.428444 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" event={"ID":"4d5a854f-c386-4811-bc85-90fa35c9915a","Type":"ContainerStarted","Data":"423ab1c0db799ce22fbc35dd25548070f51b75a91399c1cd707d4554d382b860"} Mar 12 15:06:19 crc kubenswrapper[4869]: I0312 15:06:19.432282 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" event={"ID":"38272c93-d688-45cc-8dd5-1f5ae4549216","Type":"ContainerStarted","Data":"f083100778d1a002c7f22e093b0aafd249a2e426a3022a6e426ca02eb885662c"} Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.180227 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.314609 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27lm9\" (UniqueName: \"kubernetes.io/projected/e05b23e2-5c7f-4421-9b32-b70957eea944-kube-api-access-27lm9\") pod \"e05b23e2-5c7f-4421-9b32-b70957eea944\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.314687 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-dns-svc\") pod \"e05b23e2-5c7f-4421-9b32-b70957eea944\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.314716 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-config\") pod \"e05b23e2-5c7f-4421-9b32-b70957eea944\" (UID: \"e05b23e2-5c7f-4421-9b32-b70957eea944\") " Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.332935 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05b23e2-5c7f-4421-9b32-b70957eea944-kube-api-access-27lm9" (OuterVolumeSpecName: "kube-api-access-27lm9") pod "e05b23e2-5c7f-4421-9b32-b70957eea944" (UID: "e05b23e2-5c7f-4421-9b32-b70957eea944"). InnerVolumeSpecName "kube-api-access-27lm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.355807 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e05b23e2-5c7f-4421-9b32-b70957eea944" (UID: "e05b23e2-5c7f-4421-9b32-b70957eea944"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.370600 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-config" (OuterVolumeSpecName: "config") pod "e05b23e2-5c7f-4421-9b32-b70957eea944" (UID: "e05b23e2-5c7f-4421-9b32-b70957eea944"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.417874 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.417905 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05b23e2-5c7f-4421-9b32-b70957eea944-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.417917 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27lm9\" (UniqueName: \"kubernetes.io/projected/e05b23e2-5c7f-4421-9b32-b70957eea944-kube-api-access-27lm9\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:20 crc kubenswrapper[4869]: E0312 15:06:20.427683 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Mar 12 15:06:20 crc kubenswrapper[4869]: E0312 15:06:20.427814 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:nf6h566h558hb5hd9hd6h57dh587h547h677h98hbdh5fch557h666h78h84h68ch668h687hdch677hdfh646hdh57dhf6h685h5bh5d4h5f7h58fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovs-rundir,ReadOnly:true,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:true,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hbb7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/term
ination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-metrics-hrwrb_openstack(6c89cdd2-7ac2-46d2-be58-77f8b79acd81): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:06:20 crc kubenswrapper[4869]: E0312 15:06:20.429206 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-metrics-hrwrb" podUID="6c89cdd2-7ac2-46d2-be58-77f8b79acd81" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.444119 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" event={"ID":"e05b23e2-5c7f-4421-9b32-b70957eea944","Type":"ContainerDied","Data":"59f46c4d644a41e9395acc36430a4c995073c7d5b05d4985882a5d9b54630e2e"} Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.444169 4869 scope.go:117] "RemoveContainer" containerID="a8ee6ba6483402e6643252f397b2af68355b922a12755fa4a11606e941663f25" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.444254 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-67phk" Mar 12 15:06:20 crc kubenswrapper[4869]: E0312 15:06:20.446737 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovn-controller-metrics-hrwrb" podUID="6c89cdd2-7ac2-46d2-be58-77f8b79acd81" Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.493426 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-67phk"] Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.499368 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-67phk"] Mar 12 15:06:20 crc kubenswrapper[4869]: I0312 15:06:20.945478 4869 scope.go:117] "RemoveContainer" containerID="130ba41f8efb128bfc76cbe6082fa320e002250ccbcf11d7536afb9a683ada25" Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.005215 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-j7fkr" Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.129294 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqvxq\" (UniqueName: \"kubernetes.io/projected/1b691259-763f-4535-98af-ef6fa62d8a0d-kube-api-access-sqvxq\") pod \"1b691259-763f-4535-98af-ef6fa62d8a0d\" (UID: \"1b691259-763f-4535-98af-ef6fa62d8a0d\") " Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.135502 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b691259-763f-4535-98af-ef6fa62d8a0d-kube-api-access-sqvxq" (OuterVolumeSpecName: "kube-api-access-sqvxq") pod "1b691259-763f-4535-98af-ef6fa62d8a0d" (UID: "1b691259-763f-4535-98af-ef6fa62d8a0d"). InnerVolumeSpecName "kube-api-access-sqvxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.231842 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqvxq\" (UniqueName: \"kubernetes.io/projected/1b691259-763f-4535-98af-ef6fa62d8a0d-kube-api-access-sqvxq\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.454929 4869 generic.go:334] "Generic (PLEG): container finished" podID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerID="2a3021632d133184a1533b85960f5dbf78c3786e46c74ccfef3f593fddd5db30" exitCode=0 Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.455023 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" event={"ID":"4d5a854f-c386-4811-bc85-90fa35c9915a","Type":"ContainerDied","Data":"2a3021632d133184a1533b85960f5dbf78c3786e46c74ccfef3f593fddd5db30"} Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.460524 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"28b0b79e-10ab-436b-a34b-af51bd63d60a","Type":"ContainerStarted","Data":"ca6c409be3213698ab45644fd02a4ec18e9c8033e95fece01ed7fc507ace71c7"} Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.460648 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.462741 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555466-j7fkr" event={"ID":"1b691259-763f-4535-98af-ef6fa62d8a0d","Type":"ContainerDied","Data":"ad07d33358850ac7a5556a388fbb5e84e3521d719934fc83b215f20219273fa6"} Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.462781 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad07d33358850ac7a5556a388fbb5e84e3521d719934fc83b215f20219273fa6" Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.462755 4869 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-j7fkr" Mar 12 15:06:21 crc kubenswrapper[4869]: I0312 15:06:21.504518 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.94276613 podStartE2EDuration="25.504502699s" podCreationTimestamp="2026-03-12 15:05:56 +0000 UTC" firstStartedPulling="2026-03-12 15:06:06.523555845 +0000 UTC m=+1118.808781123" lastFinishedPulling="2026-03-12 15:06:20.085292404 +0000 UTC m=+1132.370517692" observedRunningTime="2026-03-12 15:06:21.491207788 +0000 UTC m=+1133.776433066" watchObservedRunningTime="2026-03-12 15:06:21.504502699 +0000 UTC m=+1133.789727977" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.084223 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-9kdz9"] Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.092198 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-9kdz9"] Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.344432 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6888aca-055f-4d64-a0a6-e42ea6011fef" path="/var/lib/kubelet/pods/a6888aca-055f-4d64-a0a6-e42ea6011fef/volumes" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.345120 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" path="/var/lib/kubelet/pods/e05b23e2-5c7f-4421-9b32-b70957eea944/volumes" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.470486 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0323899-ea3b-4572-baa4-3483b0d5fd86","Type":"ContainerStarted","Data":"4fcf47aa2c397bc1f0fa16216f4cc78821cbeb8188e49ef858d02fe2468e098b"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.473608 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" event={"ID":"4d5a854f-c386-4811-bc85-90fa35c9915a","Type":"ContainerStarted","Data":"d7bf8c07eb2d4e25786fb9a467291e587f7d778a305f24702f7db617158c0bf6"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.473737 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.475099 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5bf47c8a-507e-4eba-9776-b516e4555df4","Type":"ContainerStarted","Data":"783704a9078570ea1bf10c9118be2b7f8b6dcb89ca41f60ee34b8b95840a9ddf"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.476723 4869 generic.go:334] "Generic (PLEG): container finished" podID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerID="38d865ea8eed4e8d885b147cba1e3ebae4509d4f55abfa4ad95f5b49d7a7b3b3" exitCode=0 Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.476752 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" event={"ID":"38272c93-d688-45cc-8dd5-1f5ae4549216","Type":"ContainerDied","Data":"38d865ea8eed4e8d885b147cba1e3ebae4509d4f55abfa4ad95f5b49d7a7b3b3"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.478746 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wzzxq" event={"ID":"ad973222-a042-43af-9c00-b0f6d795c7d1","Type":"ContainerStarted","Data":"46f46fa9ce05c3cd221ddf9c9c7986789256f568c485ead0256b42b876e1be0c"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.479387 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wzzxq" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.480405 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3","Type":"ContainerStarted","Data":"5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.480864 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.491656 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4406efc2-cefd-4e44-a5f0-7384101c9b36","Type":"ContainerStarted","Data":"7d8d687401b7d08a65d5278c3b93edfb68eeb80a06ff5041fa5922f7412969d6"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.497034 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0454dc15-f63d-475c-9640-71a8d60d9e56","Type":"ContainerStarted","Data":"1c747bfd75c9b8a544f889cd8f1b7a6541377108df94c1849ab4c5ead0d1d3d4"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.502317 4869 generic.go:334] "Generic (PLEG): container finished" podID="5b0ed011-5259-4904-82e5-320adf5ff1cf" containerID="b95be2f92503ccfbfd9ef34e1a5aa7aa36febda711863a8344782f4bb0c9a0d5" exitCode=0 Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.502892 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jvmhv" event={"ID":"5b0ed011-5259-4904-82e5-320adf5ff1cf","Type":"ContainerDied","Data":"b95be2f92503ccfbfd9ef34e1a5aa7aa36febda711863a8344782f4bb0c9a0d5"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.506348 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e764959-1933-4a88-b8de-fd853d49a0d3","Type":"ContainerStarted","Data":"fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.514805 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"c8b58aad-8641-4aec-8053-f4b75d5931e8","Type":"ContainerStarted","Data":"564d9a62437282a4796601eaea7139bbe5204a88c323eacea460adbfac87d969"} Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.529058 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wzzxq" podStartSLOduration=7.009239214 podStartE2EDuration="21.529035105s" podCreationTimestamp="2026-03-12 15:06:01 +0000 UTC" firstStartedPulling="2026-03-12 15:06:06.40402056 +0000 UTC m=+1118.689245838" lastFinishedPulling="2026-03-12 15:06:20.923816441 +0000 UTC m=+1133.209041729" observedRunningTime="2026-03-12 15:06:22.522358873 +0000 UTC m=+1134.807584151" watchObservedRunningTime="2026-03-12 15:06:22.529035105 +0000 UTC m=+1134.814260383" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.565367 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" podStartSLOduration=15.565350955 podStartE2EDuration="15.565350955s" podCreationTimestamp="2026-03-12 15:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:06:22.565291054 +0000 UTC m=+1134.850516342" watchObservedRunningTime="2026-03-12 15:06:22.565350955 +0000 UTC m=+1134.850576233" Mar 12 15:06:22 crc kubenswrapper[4869]: I0312 15:06:22.623100 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.935782349 podStartE2EDuration="24.623083719s" podCreationTimestamp="2026-03-12 15:05:58 +0000 UTC" firstStartedPulling="2026-03-12 15:06:06.403846295 +0000 UTC m=+1118.689071573" lastFinishedPulling="2026-03-12 15:06:21.091147655 +0000 UTC m=+1133.376372943" observedRunningTime="2026-03-12 15:06:22.622512983 +0000 UTC m=+1134.907738261" watchObservedRunningTime="2026-03-12 15:06:22.623083719 +0000 UTC m=+1134.908308997" Mar 12 15:06:23 crc 
kubenswrapper[4869]: I0312 15:06:23.526015 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0454dc15-f63d-475c-9640-71a8d60d9e56","Type":"ContainerStarted","Data":"01d4efe009a5b68579158a8210885e0e08eb99c7e445e4778b40710ecab2677f"} Mar 12 15:06:23 crc kubenswrapper[4869]: I0312 15:06:23.533445 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jvmhv" event={"ID":"5b0ed011-5259-4904-82e5-320adf5ff1cf","Type":"ContainerStarted","Data":"003e8443a71150dd67214a367dfca9e9a22b69a8e615e35fd0f41ba6a878865c"} Mar 12 15:06:23 crc kubenswrapper[4869]: I0312 15:06:23.559184 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5bf47c8a-507e-4eba-9776-b516e4555df4","Type":"ContainerStarted","Data":"2c45cf67bb3f77130c3a3fc4ceeff221653b15c2a7f85827e86fb2ee511d70dd"} Mar 12 15:06:23 crc kubenswrapper[4869]: I0312 15:06:23.570663 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.390996664 podStartE2EDuration="19.57064s" podCreationTimestamp="2026-03-12 15:06:04 +0000 UTC" firstStartedPulling="2026-03-12 15:06:07.048402914 +0000 UTC m=+1119.333628192" lastFinishedPulling="2026-03-12 15:06:22.22804625 +0000 UTC m=+1134.513271528" observedRunningTime="2026-03-12 15:06:23.553471029 +0000 UTC m=+1135.838696347" watchObservedRunningTime="2026-03-12 15:06:23.57064 +0000 UTC m=+1135.855865288" Mar 12 15:06:23 crc kubenswrapper[4869]: I0312 15:06:23.572635 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" event={"ID":"38272c93-d688-45cc-8dd5-1f5ae4549216","Type":"ContainerStarted","Data":"f2f3f1c715a7ffc9d3c35c3531d52eb1bdb143b1c82b4df75c4a26a6105518b0"} Mar 12 15:06:23 crc kubenswrapper[4869]: I0312 15:06:23.595736 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" 
podStartSLOduration=8.214297793 podStartE2EDuration="22.595711439s" podCreationTimestamp="2026-03-12 15:06:01 +0000 UTC" firstStartedPulling="2026-03-12 15:06:06.552221776 +0000 UTC m=+1118.837447054" lastFinishedPulling="2026-03-12 15:06:20.933635412 +0000 UTC m=+1133.218860700" observedRunningTime="2026-03-12 15:06:23.586425503 +0000 UTC m=+1135.871650811" watchObservedRunningTime="2026-03-12 15:06:23.595711439 +0000 UTC m=+1135.880936717" Mar 12 15:06:23 crc kubenswrapper[4869]: I0312 15:06:23.624685 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" podStartSLOduration=16.624668069 podStartE2EDuration="16.624668069s" podCreationTimestamp="2026-03-12 15:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:06:23.617216545 +0000 UTC m=+1135.902441833" watchObservedRunningTime="2026-03-12 15:06:23.624668069 +0000 UTC m=+1135.909893357" Mar 12 15:06:23 crc kubenswrapper[4869]: I0312 15:06:23.736718 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:24 crc kubenswrapper[4869]: I0312 15:06:24.214982 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:24 crc kubenswrapper[4869]: I0312 15:06:24.267193 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:24 crc kubenswrapper[4869]: I0312 15:06:24.585625 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jvmhv" event={"ID":"5b0ed011-5259-4904-82e5-320adf5ff1cf","Type":"ContainerStarted","Data":"24e1c0642d8658c802014672c2e2b6f0276f30c654039f065b4e20fbbca03289"} Mar 12 15:06:24 crc kubenswrapper[4869]: I0312 15:06:24.586629 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" Mar 12 15:06:24 crc kubenswrapper[4869]: I0312 15:06:24.586677 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:24 crc kubenswrapper[4869]: I0312 15:06:24.586692 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:24 crc kubenswrapper[4869]: I0312 15:06:24.626775 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jvmhv" podStartSLOduration=10.09563144 podStartE2EDuration="23.626748722s" podCreationTimestamp="2026-03-12 15:06:01 +0000 UTC" firstStartedPulling="2026-03-12 15:06:06.591475861 +0000 UTC m=+1118.876701149" lastFinishedPulling="2026-03-12 15:06:20.122593153 +0000 UTC m=+1132.407818431" observedRunningTime="2026-03-12 15:06:24.622192811 +0000 UTC m=+1136.907418110" watchObservedRunningTime="2026-03-12 15:06:24.626748722 +0000 UTC m=+1136.911974040" Mar 12 15:06:25 crc kubenswrapper[4869]: I0312 15:06:25.595495 4869 generic.go:334] "Generic (PLEG): container finished" podID="c8b58aad-8641-4aec-8053-f4b75d5931e8" containerID="564d9a62437282a4796601eaea7139bbe5204a88c323eacea460adbfac87d969" exitCode=0 Mar 12 15:06:25 crc kubenswrapper[4869]: I0312 15:06:25.595534 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8b58aad-8641-4aec-8053-f4b75d5931e8","Type":"ContainerDied","Data":"564d9a62437282a4796601eaea7139bbe5204a88c323eacea460adbfac87d969"} Mar 12 15:06:25 crc kubenswrapper[4869]: I0312 15:06:25.597133 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.255093 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.604360 4869 generic.go:334] 
"Generic (PLEG): container finished" podID="4406efc2-cefd-4e44-a5f0-7384101c9b36" containerID="7d8d687401b7d08a65d5278c3b93edfb68eeb80a06ff5041fa5922f7412969d6" exitCode=0 Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.604440 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4406efc2-cefd-4e44-a5f0-7384101c9b36","Type":"ContainerDied","Data":"7d8d687401b7d08a65d5278c3b93edfb68eeb80a06ff5041fa5922f7412969d6"} Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.606672 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8b58aad-8641-4aec-8053-f4b75d5931e8","Type":"ContainerStarted","Data":"e95b3c28044099187a900165d4d85f3659c60478badacfcc0502befd649ea542"} Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.664951 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=16.615379982 podStartE2EDuration="31.664924832s" podCreationTimestamp="2026-03-12 15:05:55 +0000 UTC" firstStartedPulling="2026-03-12 15:06:05.874417935 +0000 UTC m=+1118.159643213" lastFinishedPulling="2026-03-12 15:06:20.923962775 +0000 UTC m=+1133.209188063" observedRunningTime="2026-03-12 15:06:26.656153431 +0000 UTC m=+1138.941378719" watchObservedRunningTime="2026-03-12 15:06:26.664924832 +0000 UTC m=+1138.950150140" Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.781884 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.782314 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:26 crc kubenswrapper[4869]: I0312 15:06:26.813837 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.615860 4869 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4406efc2-cefd-4e44-a5f0-7384101c9b36","Type":"ContainerStarted","Data":"88b9163138c1e7b7d9a42e0823fb61b405f32ea74eb8da14da64f3a5ddb5ced1"} Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.634656 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.502135723 podStartE2EDuration="34.634627577s" podCreationTimestamp="2026-03-12 15:05:53 +0000 UTC" firstStartedPulling="2026-03-12 15:06:06.380838165 +0000 UTC m=+1118.666063443" lastFinishedPulling="2026-03-12 15:06:20.513330009 +0000 UTC m=+1132.798555297" observedRunningTime="2026-03-12 15:06:27.633509005 +0000 UTC m=+1139.918734313" watchObservedRunningTime="2026-03-12 15:06:27.634627577 +0000 UTC m=+1139.919852915" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.654395 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825086 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 12 15:06:27 crc kubenswrapper[4869]: E0312 15:06:27.825391 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerName="init" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825409 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerName="init" Mar 12 15:06:27 crc kubenswrapper[4869]: E0312 15:06:27.825440 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerName="dnsmasq-dns" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825449 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerName="dnsmasq-dns" Mar 12 15:06:27 crc kubenswrapper[4869]: E0312 15:06:27.825468 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerName="dnsmasq-dns" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825473 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerName="dnsmasq-dns" Mar 12 15:06:27 crc kubenswrapper[4869]: E0312 15:06:27.825493 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b691259-763f-4535-98af-ef6fa62d8a0d" containerName="oc" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825498 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b691259-763f-4535-98af-ef6fa62d8a0d" containerName="oc" Mar 12 15:06:27 crc kubenswrapper[4869]: E0312 15:06:27.825518 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerName="init" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825523 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerName="init" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825718 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05b23e2-5c7f-4421-9b32-b70957eea944" containerName="dnsmasq-dns" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825729 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b691259-763f-4535-98af-ef6fa62d8a0d" containerName="oc" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.825735 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb130117-6676-4748-89ff-b2e5f4a1c120" containerName="dnsmasq-dns" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.826448 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.828436 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.828737 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-kfwrz" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.828843 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.828997 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.844511 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.876773 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.907610 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94c54bed-21a1-4a68-8cde-f45c89a05e85-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.907869 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.908101 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.908181 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7fz\" (UniqueName: \"kubernetes.io/projected/94c54bed-21a1-4a68-8cde-f45c89a05e85-kube-api-access-lj7fz\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.908208 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94c54bed-21a1-4a68-8cde-f45c89a05e85-scripts\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.908365 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c54bed-21a1-4a68-8cde-f45c89a05e85-config\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.908409 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:27 crc kubenswrapper[4869]: I0312 15:06:27.985722 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.010341 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.010394 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7fz\" (UniqueName: \"kubernetes.io/projected/94c54bed-21a1-4a68-8cde-f45c89a05e85-kube-api-access-lj7fz\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.010417 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94c54bed-21a1-4a68-8cde-f45c89a05e85-scripts\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.010448 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c54bed-21a1-4a68-8cde-f45c89a05e85-config\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.010467 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.010494 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94c54bed-21a1-4a68-8cde-f45c89a05e85-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.010576 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.012010 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/94c54bed-21a1-4a68-8cde-f45c89a05e85-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.012419 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c54bed-21a1-4a68-8cde-f45c89a05e85-config\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.012459 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94c54bed-21a1-4a68-8cde-f45c89a05e85-scripts\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.016820 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.016978 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.018845 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c54bed-21a1-4a68-8cde-f45c89a05e85-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.035201 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xlmvz"] Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.036861 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7fz\" (UniqueName: \"kubernetes.io/projected/94c54bed-21a1-4a68-8cde-f45c89a05e85-kube-api-access-lj7fz\") pod \"ovn-northd-0\" (UID: \"94c54bed-21a1-4a68-8cde-f45c89a05e85\") " pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.146828 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.576075 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.622167 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"94c54bed-21a1-4a68-8cde-f45c89a05e85","Type":"ContainerStarted","Data":"a9c243930e99f260222e562787ab03923024a573731f8b2abcab952ca7f5d7ec"} Mar 12 15:06:28 crc kubenswrapper[4869]: I0312 15:06:28.622300 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" podUID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerName="dnsmasq-dns" containerID="cri-o://f2f3f1c715a7ffc9d3c35c3531d52eb1bdb143b1c82b4df75c4a26a6105518b0" gracePeriod=10 Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.129036 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.153831 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-mkmpj"] Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.155526 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.216632 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mkmpj"] Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.229433 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-config\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.229482 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.229527 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvjm\" (UniqueName: \"kubernetes.io/projected/349a0f9a-cebb-4465-9774-b497c95357b6-kube-api-access-nmvjm\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.229564 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.229589 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-dns-svc\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.331335 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-config\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.331394 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.331460 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmvjm\" (UniqueName: \"kubernetes.io/projected/349a0f9a-cebb-4465-9774-b497c95357b6-kube-api-access-nmvjm\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.331520 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.331566 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-dns-svc\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.332455 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-dns-svc\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.333060 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-config\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.333595 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.334135 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.352044 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvjm\" (UniqueName: \"kubernetes.io/projected/349a0f9a-cebb-4465-9774-b497c95357b6-kube-api-access-nmvjm\") pod 
\"dnsmasq-dns-698758b865-mkmpj\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.407545 4869 scope.go:117] "RemoveContainer" containerID="897337953be2efb379efb1dd04745ec6b90bdbb52ca0003d268f70b0aae27816" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.490128 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.631433 4869 generic.go:334] "Generic (PLEG): container finished" podID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerID="f2f3f1c715a7ffc9d3c35c3531d52eb1bdb143b1c82b4df75c4a26a6105518b0" exitCode=0 Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.631501 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" event={"ID":"38272c93-d688-45cc-8dd5-1f5ae4549216","Type":"ContainerDied","Data":"f2f3f1c715a7ffc9d3c35c3531d52eb1bdb143b1c82b4df75c4a26a6105518b0"} Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.794870 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.839031 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-ovsdbserver-nb\") pod \"38272c93-d688-45cc-8dd5-1f5ae4549216\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.839073 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-config\") pod \"38272c93-d688-45cc-8dd5-1f5ae4549216\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.839109 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-dns-svc\") pod \"38272c93-d688-45cc-8dd5-1f5ae4549216\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.839161 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp8hf\" (UniqueName: \"kubernetes.io/projected/38272c93-d688-45cc-8dd5-1f5ae4549216-kube-api-access-bp8hf\") pod \"38272c93-d688-45cc-8dd5-1f5ae4549216\" (UID: \"38272c93-d688-45cc-8dd5-1f5ae4549216\") " Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.851784 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38272c93-d688-45cc-8dd5-1f5ae4549216-kube-api-access-bp8hf" (OuterVolumeSpecName: "kube-api-access-bp8hf") pod "38272c93-d688-45cc-8dd5-1f5ae4549216" (UID: "38272c93-d688-45cc-8dd5-1f5ae4549216"). InnerVolumeSpecName "kube-api-access-bp8hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.871969 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38272c93-d688-45cc-8dd5-1f5ae4549216" (UID: "38272c93-d688-45cc-8dd5-1f5ae4549216"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.871964 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-config" (OuterVolumeSpecName: "config") pod "38272c93-d688-45cc-8dd5-1f5ae4549216" (UID: "38272c93-d688-45cc-8dd5-1f5ae4549216"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.897945 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38272c93-d688-45cc-8dd5-1f5ae4549216" (UID: "38272c93-d688-45cc-8dd5-1f5ae4549216"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.941075 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.941107 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.941133 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38272c93-d688-45cc-8dd5-1f5ae4549216-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.941144 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp8hf\" (UniqueName: \"kubernetes.io/projected/38272c93-d688-45cc-8dd5-1f5ae4549216-kube-api-access-bp8hf\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:29 crc kubenswrapper[4869]: I0312 15:06:29.966194 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mkmpj"] Mar 12 15:06:30 crc kubenswrapper[4869]: W0312 15:06:30.179323 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod349a0f9a_cebb_4465_9774_b497c95357b6.slice/crio-dbc60ef521237927abf30373117692ea48953dce7451c1ec3bf82c3b6c07fab9 WatchSource:0}: Error finding container dbc60ef521237927abf30373117692ea48953dce7451c1ec3bf82c3b6c07fab9: Status 404 returned error can't find the container with id dbc60ef521237927abf30373117692ea48953dce7451c1ec3bf82c3b6c07fab9 Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.322115 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 12 15:06:30 crc 
kubenswrapper[4869]: E0312 15:06:30.322449 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerName="dnsmasq-dns" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.322469 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerName="dnsmasq-dns" Mar 12 15:06:30 crc kubenswrapper[4869]: E0312 15:06:30.322511 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerName="init" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.322518 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerName="init" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.322672 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="38272c93-d688-45cc-8dd5-1f5ae4549216" containerName="dnsmasq-dns" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.327381 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.329291 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.329358 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-9sdwf" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.329489 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.329290 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.373331 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.451725 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c2260d9c-2497-44bb-9952-341844cf85d0-cache\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.452230 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.452301 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d86km\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-kube-api-access-d86km\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 
15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.452332 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c2260d9c-2497-44bb-9952-341844cf85d0-lock\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.452356 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.452387 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2260d9c-2497-44bb-9952-341844cf85d0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.553595 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2260d9c-2497-44bb-9952-341844cf85d0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.553695 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c2260d9c-2497-44bb-9952-341844cf85d0-cache\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.553728 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.553792 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d86km\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-kube-api-access-d86km\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.553823 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c2260d9c-2497-44bb-9952-341844cf85d0-lock\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.554118 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.554393 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c2260d9c-2497-44bb-9952-341844cf85d0-cache\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.554487 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c2260d9c-2497-44bb-9952-341844cf85d0-lock\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 
15:06:30.556470 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: E0312 15:06:30.556665 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 15:06:30 crc kubenswrapper[4869]: E0312 15:06:30.556689 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 15:06:30 crc kubenswrapper[4869]: E0312 15:06:30.556749 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift podName:c2260d9c-2497-44bb-9952-341844cf85d0 nodeName:}" failed. No retries permitted until 2026-03-12 15:06:31.056730045 +0000 UTC m=+1143.341955323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift") pod "swift-storage-0" (UID: "c2260d9c-2497-44bb-9952-341844cf85d0") : configmap "swift-ring-files" not found Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.558128 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2260d9c-2497-44bb-9952-341844cf85d0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.570271 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d86km\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-kube-api-access-d86km\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.576701 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.641365 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" event={"ID":"38272c93-d688-45cc-8dd5-1f5ae4549216","Type":"ContainerDied","Data":"f083100778d1a002c7f22e093b0aafd249a2e426a3022a6e426ca02eb885662c"} Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.641406 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xlmvz" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.641416 4869 scope.go:117] "RemoveContainer" containerID="f2f3f1c715a7ffc9d3c35c3531d52eb1bdb143b1c82b4df75c4a26a6105518b0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.643404 4869 generic.go:334] "Generic (PLEG): container finished" podID="349a0f9a-cebb-4465-9774-b497c95357b6" containerID="b719cf0d2219bc98e57cc40a542a4c524ddacd6264fc96a45d45a2f39129b310" exitCode=0 Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.643472 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mkmpj" event={"ID":"349a0f9a-cebb-4465-9774-b497c95357b6","Type":"ContainerDied","Data":"b719cf0d2219bc98e57cc40a542a4c524ddacd6264fc96a45d45a2f39129b310"} Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.643526 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mkmpj" event={"ID":"349a0f9a-cebb-4465-9774-b497c95357b6","Type":"ContainerStarted","Data":"dbc60ef521237927abf30373117692ea48953dce7451c1ec3bf82c3b6c07fab9"} Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.648453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"94c54bed-21a1-4a68-8cde-f45c89a05e85","Type":"ContainerStarted","Data":"e1b1ef5e4121f8d6e3bf2beef403cd4a7c382585a1735b93279e080d76421e4b"} Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.648501 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"94c54bed-21a1-4a68-8cde-f45c89a05e85","Type":"ContainerStarted","Data":"7e496d1f1ac58a2dec79b1d463bc97b55856da3ef7c579fac5206690757d8489"} Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.649117 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.658115 4869 scope.go:117] "RemoveContainer" 
containerID="38d865ea8eed4e8d885b147cba1e3ebae4509d4f55abfa4ad95f5b49d7a7b3b3" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.684321 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.031333969 podStartE2EDuration="3.684302151s" podCreationTimestamp="2026-03-12 15:06:27 +0000 UTC" firstStartedPulling="2026-03-12 15:06:28.582272491 +0000 UTC m=+1140.867497779" lastFinishedPulling="2026-03-12 15:06:30.235240683 +0000 UTC m=+1142.520465961" observedRunningTime="2026-03-12 15:06:30.682324564 +0000 UTC m=+1142.967549862" watchObservedRunningTime="2026-03-12 15:06:30.684302151 +0000 UTC m=+1142.969527429" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.701618 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xlmvz"] Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.708591 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xlmvz"] Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.833632 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mbvj6"] Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.834771 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.837404 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.837842 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.839726 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.858821 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mbvj6"] Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.965914 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-dispersionconf\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.966207 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-scripts\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.966326 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-combined-ca-bundle\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.966461 
4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-swiftconf\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.966642 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx5r5\" (UniqueName: \"kubernetes.io/projected/db6dbe81-f329-4b21-be85-3ac61ed4c428-kube-api-access-zx5r5\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.966806 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/db6dbe81-f329-4b21-be85-3ac61ed4c428-etc-swift\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:30 crc kubenswrapper[4869]: I0312 15:06:30.967001 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-ring-data-devices\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.068978 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-scripts\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.069032 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-combined-ca-bundle\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.069056 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.069096 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-swiftconf\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.069146 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx5r5\" (UniqueName: \"kubernetes.io/projected/db6dbe81-f329-4b21-be85-3ac61ed4c428-kube-api-access-zx5r5\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.069182 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/db6dbe81-f329-4b21-be85-3ac61ed4c428-etc-swift\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.069240 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-ring-data-devices\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.069293 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-dispersionconf\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.070010 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/db6dbe81-f329-4b21-be85-3ac61ed4c428-etc-swift\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: E0312 15:06:31.070308 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 15:06:31 crc kubenswrapper[4869]: E0312 15:06:31.070339 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 15:06:31 crc kubenswrapper[4869]: E0312 15:06:31.070399 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift podName:c2260d9c-2497-44bb-9952-341844cf85d0 nodeName:}" failed. No retries permitted until 2026-03-12 15:06:32.070379703 +0000 UTC m=+1144.355604981 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift") pod "swift-storage-0" (UID: "c2260d9c-2497-44bb-9952-341844cf85d0") : configmap "swift-ring-files" not found Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.070795 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-scripts\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.070876 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-ring-data-devices\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.073017 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-dispersionconf\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.073471 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-combined-ca-bundle\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.073611 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-swiftconf\") pod 
\"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.087726 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx5r5\" (UniqueName: \"kubernetes.io/projected/db6dbe81-f329-4b21-be85-3ac61ed4c428-kube-api-access-zx5r5\") pod \"swift-ring-rebalance-mbvj6\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.154528 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.583603 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mbvj6"] Mar 12 15:06:31 crc kubenswrapper[4869]: W0312 15:06:31.589843 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb6dbe81_f329_4b21_be85_3ac61ed4c428.slice/crio-68a317b6d2b3c98a0dccaf89378cc734295cc49067b94e65174ad5b7936fa478 WatchSource:0}: Error finding container 68a317b6d2b3c98a0dccaf89378cc734295cc49067b94e65174ad5b7936fa478: Status 404 returned error can't find the container with id 68a317b6d2b3c98a0dccaf89378cc734295cc49067b94e65174ad5b7936fa478 Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.655668 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbvj6" event={"ID":"db6dbe81-f329-4b21-be85-3ac61ed4c428","Type":"ContainerStarted","Data":"68a317b6d2b3c98a0dccaf89378cc734295cc49067b94e65174ad5b7936fa478"} Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.657860 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mkmpj" 
event={"ID":"349a0f9a-cebb-4465-9774-b497c95357b6","Type":"ContainerStarted","Data":"0c07fd9e49269aa67a9298f7761da7f26974c0466717a12acdcc05514613aa2a"} Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.657994 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.659256 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hrwrb" event={"ID":"6c89cdd2-7ac2-46d2-be58-77f8b79acd81","Type":"ContainerStarted","Data":"d148efdc6f3d8f9419bc7324887c57d9b2ad9ca9b59a90c81f54b9b9b3af3c07"} Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.700132 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-mkmpj" podStartSLOduration=2.700117477 podStartE2EDuration="2.700117477s" podCreationTimestamp="2026-03-12 15:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:06:31.677998643 +0000 UTC m=+1143.963223921" watchObservedRunningTime="2026-03-12 15:06:31.700117477 +0000 UTC m=+1143.985342755" Mar 12 15:06:31 crc kubenswrapper[4869]: I0312 15:06:31.702442 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hrwrb" podStartSLOduration=-9223372012.152342 podStartE2EDuration="24.702434234s" podCreationTimestamp="2026-03-12 15:06:07 +0000 UTC" firstStartedPulling="2026-03-12 15:06:08.311750032 +0000 UTC m=+1120.596975310" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:06:31.69673912 +0000 UTC m=+1143.981964428" watchObservedRunningTime="2026-03-12 15:06:31.702434234 +0000 UTC m=+1143.987659512" Mar 12 15:06:31 crc kubenswrapper[4869]: E0312 15:06:31.813823 4869 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
38.102.83.223:45324->38.102.83.223:34669: write tcp 38.102.83.223:45324->38.102.83.223:34669: write: broken pipe Mar 12 15:06:32 crc kubenswrapper[4869]: I0312 15:06:32.087244 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:32 crc kubenswrapper[4869]: E0312 15:06:32.087435 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 15:06:32 crc kubenswrapper[4869]: E0312 15:06:32.087469 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 15:06:32 crc kubenswrapper[4869]: E0312 15:06:32.087560 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift podName:c2260d9c-2497-44bb-9952-341844cf85d0 nodeName:}" failed. No retries permitted until 2026-03-12 15:06:34.087515627 +0000 UTC m=+1146.372740905 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift") pod "swift-storage-0" (UID: "c2260d9c-2497-44bb-9952-341844cf85d0") : configmap "swift-ring-files" not found Mar 12 15:06:32 crc kubenswrapper[4869]: I0312 15:06:32.348156 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38272c93-d688-45cc-8dd5-1f5ae4549216" path="/var/lib/kubelet/pods/38272c93-d688-45cc-8dd5-1f5ae4549216/volumes" Mar 12 15:06:34 crc kubenswrapper[4869]: I0312 15:06:34.120186 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:34 crc kubenswrapper[4869]: E0312 15:06:34.120744 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 15:06:34 crc kubenswrapper[4869]: E0312 15:06:34.120760 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 15:06:34 crc kubenswrapper[4869]: E0312 15:06:34.120807 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift podName:c2260d9c-2497-44bb-9952-341844cf85d0 nodeName:}" failed. No retries permitted until 2026-03-12 15:06:38.120789587 +0000 UTC m=+1150.406014865 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift") pod "swift-storage-0" (UID: "c2260d9c-2497-44bb-9952-341844cf85d0") : configmap "swift-ring-files" not found Mar 12 15:06:35 crc kubenswrapper[4869]: I0312 15:06:35.012668 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 12 15:06:35 crc kubenswrapper[4869]: I0312 15:06:35.013233 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 12 15:06:35 crc kubenswrapper[4869]: I0312 15:06:35.078388 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 12 15:06:35 crc kubenswrapper[4869]: I0312 15:06:35.692466 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbvj6" event={"ID":"db6dbe81-f329-4b21-be85-3ac61ed4c428","Type":"ContainerStarted","Data":"a108c6e5768b08f563571e7b64d9e72cb794f59a6c7914088bdece5e9d12fc27"} Mar 12 15:06:35 crc kubenswrapper[4869]: I0312 15:06:35.710236 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mbvj6" podStartSLOduration=2.530293295 podStartE2EDuration="5.71021632s" podCreationTimestamp="2026-03-12 15:06:30 +0000 UTC" firstStartedPulling="2026-03-12 15:06:31.591959218 +0000 UTC m=+1143.877184496" lastFinishedPulling="2026-03-12 15:06:34.771882243 +0000 UTC m=+1147.057107521" observedRunningTime="2026-03-12 15:06:35.707668737 +0000 UTC m=+1147.992894055" watchObservedRunningTime="2026-03-12 15:06:35.71021632 +0000 UTC m=+1147.995441598" Mar 12 15:06:35 crc kubenswrapper[4869]: I0312 15:06:35.779311 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 12 15:06:36 crc kubenswrapper[4869]: I0312 15:06:36.481906 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 12 15:06:36 crc kubenswrapper[4869]: I0312 15:06:36.482222 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 12 15:06:36 crc kubenswrapper[4869]: I0312 15:06:36.588490 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 12 15:06:36 crc kubenswrapper[4869]: I0312 15:06:36.763565 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.096121 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6c3f-account-create-update-pm6md"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.097319 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.101518 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.103927 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6c3f-account-create-update-pm6md"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.143621 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-z6tjp"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.144732 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.152693 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z6tjp"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.174669 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgj5p\" (UniqueName: \"kubernetes.io/projected/153f24a9-9165-4c37-87da-8b316a1a64e2-kube-api-access-mgj5p\") pod \"glance-6c3f-account-create-update-pm6md\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.174907 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/153f24a9-9165-4c37-87da-8b316a1a64e2-operator-scripts\") pod \"glance-6c3f-account-create-update-pm6md\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.275934 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgj5p\" (UniqueName: \"kubernetes.io/projected/153f24a9-9165-4c37-87da-8b316a1a64e2-kube-api-access-mgj5p\") pod \"glance-6c3f-account-create-update-pm6md\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.276280 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/153f24a9-9165-4c37-87da-8b316a1a64e2-operator-scripts\") pod \"glance-6c3f-account-create-update-pm6md\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 
15:06:37.277021 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxk4\" (UniqueName: \"kubernetes.io/projected/47703057-3858-435c-876e-79d208f7e023-kube-api-access-8cxk4\") pod \"glance-db-create-z6tjp\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.277145 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47703057-3858-435c-876e-79d208f7e023-operator-scripts\") pod \"glance-db-create-z6tjp\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.276942 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/153f24a9-9165-4c37-87da-8b316a1a64e2-operator-scripts\") pod \"glance-6c3f-account-create-update-pm6md\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.305067 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgj5p\" (UniqueName: \"kubernetes.io/projected/153f24a9-9165-4c37-87da-8b316a1a64e2-kube-api-access-mgj5p\") pod \"glance-6c3f-account-create-update-pm6md\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.378440 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxk4\" (UniqueName: \"kubernetes.io/projected/47703057-3858-435c-876e-79d208f7e023-kube-api-access-8cxk4\") pod \"glance-db-create-z6tjp\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc 
kubenswrapper[4869]: I0312 15:06:37.378508 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47703057-3858-435c-876e-79d208f7e023-operator-scripts\") pod \"glance-db-create-z6tjp\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.380192 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47703057-3858-435c-876e-79d208f7e023-operator-scripts\") pod \"glance-db-create-z6tjp\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.400495 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxk4\" (UniqueName: \"kubernetes.io/projected/47703057-3858-435c-876e-79d208f7e023-kube-api-access-8cxk4\") pod \"glance-db-create-z6tjp\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.414021 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.465150 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.747335 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-hsxhc"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.748581 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.762253 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hsxhc"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.853007 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6c3f-account-create-update-pm6md"] Mar 12 15:06:37 crc kubenswrapper[4869]: W0312 15:06:37.862100 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod153f24a9_9165_4c37_87da_8b316a1a64e2.slice/crio-3ba351699486eb745d1f3b65c78a5be98ae70a336da45dd3a9aa8b2768bb2987 WatchSource:0}: Error finding container 3ba351699486eb745d1f3b65c78a5be98ae70a336da45dd3a9aa8b2768bb2987: Status 404 returned error can't find the container with id 3ba351699486eb745d1f3b65c78a5be98ae70a336da45dd3a9aa8b2768bb2987 Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.863914 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0104-account-create-update-c8czb"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.864925 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.867482 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.873691 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0104-account-create-update-c8czb"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.885997 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-operator-scripts\") pod \"keystone-db-create-hsxhc\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.886059 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7rg7\" (UniqueName: \"kubernetes.io/projected/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-kube-api-access-g7rg7\") pod \"keystone-db-create-hsxhc\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.964228 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8j2j6"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.965188 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.976685 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8j2j6"] Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.987405 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-operator-scripts\") pod \"keystone-db-create-hsxhc\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.987690 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7rg7\" (UniqueName: \"kubernetes.io/projected/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-kube-api-access-g7rg7\") pod \"keystone-db-create-hsxhc\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.987727 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj85k\" (UniqueName: \"kubernetes.io/projected/90fff1a6-7608-44d4-907b-25b03dc787f1-kube-api-access-bj85k\") pod \"keystone-0104-account-create-update-c8czb\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.988031 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90fff1a6-7608-44d4-907b-25b03dc787f1-operator-scripts\") pod \"keystone-0104-account-create-update-c8czb\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:37 crc kubenswrapper[4869]: I0312 15:06:37.989325 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-operator-scripts\") pod \"keystone-db-create-hsxhc\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:38 crc kubenswrapper[4869]: W0312 15:06:38.000829 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47703057_3858_435c_876e_79d208f7e023.slice/crio-855eb88957490486bd7a6dcbafdbddd07048738177f840898e5bdc1eedd3ab82 WatchSource:0}: Error finding container 855eb88957490486bd7a6dcbafdbddd07048738177f840898e5bdc1eedd3ab82: Status 404 returned error can't find the container with id 855eb88957490486bd7a6dcbafdbddd07048738177f840898e5bdc1eedd3ab82 Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.010106 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z6tjp"] Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.013186 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7rg7\" (UniqueName: \"kubernetes.io/projected/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-kube-api-access-g7rg7\") pod \"keystone-db-create-hsxhc\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.062311 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-379e-account-create-update-4fhtb"] Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.063889 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.066362 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.071842 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.072665 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-379e-account-create-update-4fhtb"] Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.089330 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj85k\" (UniqueName: \"kubernetes.io/projected/90fff1a6-7608-44d4-907b-25b03dc787f1-kube-api-access-bj85k\") pod \"keystone-0104-account-create-update-c8czb\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.089374 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031a0a0a-23c0-4e63-959a-a18b826d985c-operator-scripts\") pod \"placement-db-create-8j2j6\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.089459 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90fff1a6-7608-44d4-907b-25b03dc787f1-operator-scripts\") pod \"keystone-0104-account-create-update-c8czb\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.089604 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cf9w\" (UniqueName: \"kubernetes.io/projected/031a0a0a-23c0-4e63-959a-a18b826d985c-kube-api-access-8cf9w\") pod \"placement-db-create-8j2j6\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.090025 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90fff1a6-7608-44d4-907b-25b03dc787f1-operator-scripts\") pod \"keystone-0104-account-create-update-c8czb\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.108421 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj85k\" (UniqueName: \"kubernetes.io/projected/90fff1a6-7608-44d4-907b-25b03dc787f1-kube-api-access-bj85k\") pod \"keystone-0104-account-create-update-c8czb\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.188325 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.191184 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031a0a0a-23c0-4e63-959a-a18b826d985c-operator-scripts\") pod \"placement-db-create-8j2j6\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.191280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca9b94a-b7c4-46ef-afe9-a61bdf789138-operator-scripts\") pod \"placement-379e-account-create-update-4fhtb\" (UID: \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.191313 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfzmg\" (UniqueName: \"kubernetes.io/projected/aca9b94a-b7c4-46ef-afe9-a61bdf789138-kube-api-access-cfzmg\") pod \"placement-379e-account-create-update-4fhtb\" (UID: \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.191509 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.191684 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cf9w\" (UniqueName: \"kubernetes.io/projected/031a0a0a-23c0-4e63-959a-a18b826d985c-kube-api-access-8cf9w\") pod 
\"placement-db-create-8j2j6\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.191871 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031a0a0a-23c0-4e63-959a-a18b826d985c-operator-scripts\") pod \"placement-db-create-8j2j6\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:38 crc kubenswrapper[4869]: E0312 15:06:38.192013 4869 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 15:06:38 crc kubenswrapper[4869]: E0312 15:06:38.192029 4869 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 15:06:38 crc kubenswrapper[4869]: E0312 15:06:38.192080 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift podName:c2260d9c-2497-44bb-9952-341844cf85d0 nodeName:}" failed. No retries permitted until 2026-03-12 15:06:46.192052802 +0000 UTC m=+1158.477278080 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift") pod "swift-storage-0" (UID: "c2260d9c-2497-44bb-9952-341844cf85d0") : configmap "swift-ring-files" not found Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.209384 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cf9w\" (UniqueName: \"kubernetes.io/projected/031a0a0a-23c0-4e63-959a-a18b826d985c-kube-api-access-8cf9w\") pod \"placement-db-create-8j2j6\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.284690 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.292865 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca9b94a-b7c4-46ef-afe9-a61bdf789138-operator-scripts\") pod \"placement-379e-account-create-update-4fhtb\" (UID: \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.293152 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfzmg\" (UniqueName: \"kubernetes.io/projected/aca9b94a-b7c4-46ef-afe9-a61bdf789138-kube-api-access-cfzmg\") pod \"placement-379e-account-create-update-4fhtb\" (UID: \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.294141 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca9b94a-b7c4-46ef-afe9-a61bdf789138-operator-scripts\") pod \"placement-379e-account-create-update-4fhtb\" (UID: 
\"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.310429 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfzmg\" (UniqueName: \"kubernetes.io/projected/aca9b94a-b7c4-46ef-afe9-a61bdf789138-kube-api-access-cfzmg\") pod \"placement-379e-account-create-update-4fhtb\" (UID: \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.393204 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.527279 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-hsxhc"] Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.717404 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0104-account-create-update-c8czb"] Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.721748 4869 generic.go:334] "Generic (PLEG): container finished" podID="153f24a9-9165-4c37-87da-8b316a1a64e2" containerID="57f46838f8acc49c64c26dd3e754df9d2ce140ccd168d2986aa03a401f13ba11" exitCode=0 Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.721822 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6c3f-account-create-update-pm6md" event={"ID":"153f24a9-9165-4c37-87da-8b316a1a64e2","Type":"ContainerDied","Data":"57f46838f8acc49c64c26dd3e754df9d2ce140ccd168d2986aa03a401f13ba11"} Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.721861 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6c3f-account-create-update-pm6md" event={"ID":"153f24a9-9165-4c37-87da-8b316a1a64e2","Type":"ContainerStarted","Data":"3ba351699486eb745d1f3b65c78a5be98ae70a336da45dd3a9aa8b2768bb2987"} Mar 12 15:06:38 crc 
kubenswrapper[4869]: I0312 15:06:38.723044 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsxhc" event={"ID":"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae","Type":"ContainerStarted","Data":"0f11c17d6fda01797911d63a6fe8618956c8b677c30f987289b01ac858fca9f2"} Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.723068 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsxhc" event={"ID":"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae","Type":"ContainerStarted","Data":"5e6fb3ce951f0160a3c9e56e5decebf3b477ae08441ea70263b1466c627d8950"} Mar 12 15:06:38 crc kubenswrapper[4869]: W0312 15:06:38.725250 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fff1a6_7608_44d4_907b_25b03dc787f1.slice/crio-1f4a57f9e7fdf3cdc97581bce6667fe126731f589e56ef4456cb562ef90250b5 WatchSource:0}: Error finding container 1f4a57f9e7fdf3cdc97581bce6667fe126731f589e56ef4456cb562ef90250b5: Status 404 returned error can't find the container with id 1f4a57f9e7fdf3cdc97581bce6667fe126731f589e56ef4456cb562ef90250b5 Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.726770 4869 generic.go:334] "Generic (PLEG): container finished" podID="47703057-3858-435c-876e-79d208f7e023" containerID="af39558788478d12f7c7ceee80794db28f43b752c6ee72bd04ea21cb05f4b98f" exitCode=0 Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.726855 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z6tjp" event={"ID":"47703057-3858-435c-876e-79d208f7e023","Type":"ContainerDied","Data":"af39558788478d12f7c7ceee80794db28f43b752c6ee72bd04ea21cb05f4b98f"} Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.726894 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z6tjp" 
event={"ID":"47703057-3858-435c-876e-79d208f7e023","Type":"ContainerStarted","Data":"855eb88957490486bd7a6dcbafdbddd07048738177f840898e5bdc1eedd3ab82"} Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.755945 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-hsxhc" podStartSLOduration=1.755929659 podStartE2EDuration="1.755929659s" podCreationTimestamp="2026-03-12 15:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:06:38.752994735 +0000 UTC m=+1151.038220013" watchObservedRunningTime="2026-03-12 15:06:38.755929659 +0000 UTC m=+1151.041154937" Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.872360 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8j2j6"] Mar 12 15:06:38 crc kubenswrapper[4869]: W0312 15:06:38.876419 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031a0a0a_23c0_4e63_959a_a18b826d985c.slice/crio-fc48e29a4cf8351551c4e5ae9905ae31b3194c17b895b89bc41846a66daa96ec WatchSource:0}: Error finding container fc48e29a4cf8351551c4e5ae9905ae31b3194c17b895b89bc41846a66daa96ec: Status 404 returned error can't find the container with id fc48e29a4cf8351551c4e5ae9905ae31b3194c17b895b89bc41846a66daa96ec Mar 12 15:06:38 crc kubenswrapper[4869]: I0312 15:06:38.894436 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-379e-account-create-update-4fhtb"] Mar 12 15:06:38 crc kubenswrapper[4869]: W0312 15:06:38.910915 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaca9b94a_b7c4_46ef_afe9_a61bdf789138.slice/crio-8ecf8dbe043fe55097bedfc7f7d5aeaf64f733f09bd4e3bcb43fcbd42bbe4192 WatchSource:0}: Error finding container 
8ecf8dbe043fe55097bedfc7f7d5aeaf64f733f09bd4e3bcb43fcbd42bbe4192: Status 404 returned error can't find the container with id 8ecf8dbe043fe55097bedfc7f7d5aeaf64f733f09bd4e3bcb43fcbd42bbe4192 Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.491650 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.565978 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-km7wm"] Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.566197 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" podUID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerName="dnsmasq-dns" containerID="cri-o://d7bf8c07eb2d4e25786fb9a467291e587f7d778a305f24702f7db617158c0bf6" gracePeriod=10 Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.739498 4869 generic.go:334] "Generic (PLEG): container finished" podID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerID="d7bf8c07eb2d4e25786fb9a467291e587f7d778a305f24702f7db617158c0bf6" exitCode=0 Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.739680 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" event={"ID":"4d5a854f-c386-4811-bc85-90fa35c9915a","Type":"ContainerDied","Data":"d7bf8c07eb2d4e25786fb9a467291e587f7d778a305f24702f7db617158c0bf6"} Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.741250 4869 generic.go:334] "Generic (PLEG): container finished" podID="031a0a0a-23c0-4e63-959a-a18b826d985c" containerID="6af08a6ff9b419383ee5c8d8382ee8e3ee2134a04856f377c4930ee11e803f97" exitCode=0 Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.741303 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8j2j6" 
event={"ID":"031a0a0a-23c0-4e63-959a-a18b826d985c","Type":"ContainerDied","Data":"6af08a6ff9b419383ee5c8d8382ee8e3ee2134a04856f377c4930ee11e803f97"} Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.741326 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8j2j6" event={"ID":"031a0a0a-23c0-4e63-959a-a18b826d985c","Type":"ContainerStarted","Data":"fc48e29a4cf8351551c4e5ae9905ae31b3194c17b895b89bc41846a66daa96ec"} Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.772126 4869 generic.go:334] "Generic (PLEG): container finished" podID="90fff1a6-7608-44d4-907b-25b03dc787f1" containerID="d8b7f085be38379d8095677d18d685b0df4208cd2575cd83e383228f23a6875c" exitCode=0 Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.772244 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0104-account-create-update-c8czb" event={"ID":"90fff1a6-7608-44d4-907b-25b03dc787f1","Type":"ContainerDied","Data":"d8b7f085be38379d8095677d18d685b0df4208cd2575cd83e383228f23a6875c"} Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.772274 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0104-account-create-update-c8czb" event={"ID":"90fff1a6-7608-44d4-907b-25b03dc787f1","Type":"ContainerStarted","Data":"1f4a57f9e7fdf3cdc97581bce6667fe126731f589e56ef4456cb562ef90250b5"} Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.781190 4869 generic.go:334] "Generic (PLEG): container finished" podID="aca9b94a-b7c4-46ef-afe9-a61bdf789138" containerID="85735fec3a2ab9087b7382ec8eb98e9ade6eecc5dc90c0f8a1262267adcfe7dd" exitCode=0 Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.781288 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-379e-account-create-update-4fhtb" event={"ID":"aca9b94a-b7c4-46ef-afe9-a61bdf789138","Type":"ContainerDied","Data":"85735fec3a2ab9087b7382ec8eb98e9ade6eecc5dc90c0f8a1262267adcfe7dd"} Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 
15:06:39.781319 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-379e-account-create-update-4fhtb" event={"ID":"aca9b94a-b7c4-46ef-afe9-a61bdf789138","Type":"ContainerStarted","Data":"8ecf8dbe043fe55097bedfc7f7d5aeaf64f733f09bd4e3bcb43fcbd42bbe4192"} Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.786875 4869 generic.go:334] "Generic (PLEG): container finished" podID="ac8a7aa9-c42d-4074-a665-a6b2bdc572ae" containerID="0f11c17d6fda01797911d63a6fe8618956c8b677c30f987289b01ac858fca9f2" exitCode=0 Mar 12 15:06:39 crc kubenswrapper[4869]: I0312 15:06:39.787174 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsxhc" event={"ID":"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae","Type":"ContainerDied","Data":"0f11c17d6fda01797911d63a6fe8618956c8b677c30f987289b01ac858fca9f2"} Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.131696 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.234223 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.235126 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-config\") pod \"4d5a854f-c386-4811-bc85-90fa35c9915a\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.235223 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p86k7\" (UniqueName: \"kubernetes.io/projected/4d5a854f-c386-4811-bc85-90fa35c9915a-kube-api-access-p86k7\") pod \"4d5a854f-c386-4811-bc85-90fa35c9915a\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.235330 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-sb\") pod \"4d5a854f-c386-4811-bc85-90fa35c9915a\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.235359 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-dns-svc\") pod \"4d5a854f-c386-4811-bc85-90fa35c9915a\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.235386 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-nb\") pod \"4d5a854f-c386-4811-bc85-90fa35c9915a\" (UID: \"4d5a854f-c386-4811-bc85-90fa35c9915a\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.253419 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4d5a854f-c386-4811-bc85-90fa35c9915a-kube-api-access-p86k7" (OuterVolumeSpecName: "kube-api-access-p86k7") pod "4d5a854f-c386-4811-bc85-90fa35c9915a" (UID: "4d5a854f-c386-4811-bc85-90fa35c9915a"). InnerVolumeSpecName "kube-api-access-p86k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.274762 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.278098 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d5a854f-c386-4811-bc85-90fa35c9915a" (UID: "4d5a854f-c386-4811-bc85-90fa35c9915a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.279866 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-config" (OuterVolumeSpecName: "config") pod "4d5a854f-c386-4811-bc85-90fa35c9915a" (UID: "4d5a854f-c386-4811-bc85-90fa35c9915a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.299762 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d5a854f-c386-4811-bc85-90fa35c9915a" (UID: "4d5a854f-c386-4811-bc85-90fa35c9915a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.312020 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d5a854f-c386-4811-bc85-90fa35c9915a" (UID: "4d5a854f-c386-4811-bc85-90fa35c9915a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.336318 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgj5p\" (UniqueName: \"kubernetes.io/projected/153f24a9-9165-4c37-87da-8b316a1a64e2-kube-api-access-mgj5p\") pod \"153f24a9-9165-4c37-87da-8b316a1a64e2\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.336396 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cxk4\" (UniqueName: \"kubernetes.io/projected/47703057-3858-435c-876e-79d208f7e023-kube-api-access-8cxk4\") pod \"47703057-3858-435c-876e-79d208f7e023\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.336440 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47703057-3858-435c-876e-79d208f7e023-operator-scripts\") pod \"47703057-3858-435c-876e-79d208f7e023\" (UID: \"47703057-3858-435c-876e-79d208f7e023\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.336460 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/153f24a9-9165-4c37-87da-8b316a1a64e2-operator-scripts\") pod \"153f24a9-9165-4c37-87da-8b316a1a64e2\" (UID: \"153f24a9-9165-4c37-87da-8b316a1a64e2\") " Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.336979 4869 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.336999 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.337012 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.337023 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d5a854f-c386-4811-bc85-90fa35c9915a-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.337037 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p86k7\" (UniqueName: \"kubernetes.io/projected/4d5a854f-c386-4811-bc85-90fa35c9915a-kube-api-access-p86k7\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.337769 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47703057-3858-435c-876e-79d208f7e023-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47703057-3858-435c-876e-79d208f7e023" (UID: "47703057-3858-435c-876e-79d208f7e023"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.337955 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/153f24a9-9165-4c37-87da-8b316a1a64e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "153f24a9-9165-4c37-87da-8b316a1a64e2" (UID: "153f24a9-9165-4c37-87da-8b316a1a64e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.339464 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153f24a9-9165-4c37-87da-8b316a1a64e2-kube-api-access-mgj5p" (OuterVolumeSpecName: "kube-api-access-mgj5p") pod "153f24a9-9165-4c37-87da-8b316a1a64e2" (UID: "153f24a9-9165-4c37-87da-8b316a1a64e2"). InnerVolumeSpecName "kube-api-access-mgj5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.342079 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47703057-3858-435c-876e-79d208f7e023-kube-api-access-8cxk4" (OuterVolumeSpecName: "kube-api-access-8cxk4") pod "47703057-3858-435c-876e-79d208f7e023" (UID: "47703057-3858-435c-876e-79d208f7e023"). InnerVolumeSpecName "kube-api-access-8cxk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.439067 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cxk4\" (UniqueName: \"kubernetes.io/projected/47703057-3858-435c-876e-79d208f7e023-kube-api-access-8cxk4\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.439115 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47703057-3858-435c-876e-79d208f7e023-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.439177 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/153f24a9-9165-4c37-87da-8b316a1a64e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.439191 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgj5p\" (UniqueName: \"kubernetes.io/projected/153f24a9-9165-4c37-87da-8b316a1a64e2-kube-api-access-mgj5p\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.796263 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" event={"ID":"4d5a854f-c386-4811-bc85-90fa35c9915a","Type":"ContainerDied","Data":"423ab1c0db799ce22fbc35dd25548070f51b75a91399c1cd707d4554d382b860"} Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.796321 4869 scope.go:117] "RemoveContainer" containerID="d7bf8c07eb2d4e25786fb9a467291e587f7d778a305f24702f7db617158c0bf6" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.796367 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-km7wm" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.799307 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6c3f-account-create-update-pm6md" event={"ID":"153f24a9-9165-4c37-87da-8b316a1a64e2","Type":"ContainerDied","Data":"3ba351699486eb745d1f3b65c78a5be98ae70a336da45dd3a9aa8b2768bb2987"} Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.799345 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ba351699486eb745d1f3b65c78a5be98ae70a336da45dd3a9aa8b2768bb2987" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.799389 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6c3f-account-create-update-pm6md" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.802142 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z6tjp" event={"ID":"47703057-3858-435c-876e-79d208f7e023","Type":"ContainerDied","Data":"855eb88957490486bd7a6dcbafdbddd07048738177f840898e5bdc1eedd3ab82"} Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.802178 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="855eb88957490486bd7a6dcbafdbddd07048738177f840898e5bdc1eedd3ab82" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.802129 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-z6tjp" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.818500 4869 scope.go:117] "RemoveContainer" containerID="2a3021632d133184a1533b85960f5dbf78c3786e46c74ccfef3f593fddd5db30" Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.827814 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-km7wm"] Mar 12 15:06:40 crc kubenswrapper[4869]: I0312 15:06:40.835274 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-km7wm"] Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.118359 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.254113 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aca9b94a-b7c4-46ef-afe9-a61bdf789138-operator-scripts\") pod \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\" (UID: \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.254479 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfzmg\" (UniqueName: \"kubernetes.io/projected/aca9b94a-b7c4-46ef-afe9-a61bdf789138-kube-api-access-cfzmg\") pod \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\" (UID: \"aca9b94a-b7c4-46ef-afe9-a61bdf789138\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.254885 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca9b94a-b7c4-46ef-afe9-a61bdf789138-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aca9b94a-b7c4-46ef-afe9-a61bdf789138" (UID: "aca9b94a-b7c4-46ef-afe9-a61bdf789138"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.258781 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca9b94a-b7c4-46ef-afe9-a61bdf789138-kube-api-access-cfzmg" (OuterVolumeSpecName: "kube-api-access-cfzmg") pod "aca9b94a-b7c4-46ef-afe9-a61bdf789138" (UID: "aca9b94a-b7c4-46ef-afe9-a61bdf789138"). InnerVolumeSpecName "kube-api-access-cfzmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.295696 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.314200 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.327323 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.363093 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cf9w\" (UniqueName: \"kubernetes.io/projected/031a0a0a-23c0-4e63-959a-a18b826d985c-kube-api-access-8cf9w\") pod \"031a0a0a-23c0-4e63-959a-a18b826d985c\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.363295 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj85k\" (UniqueName: \"kubernetes.io/projected/90fff1a6-7608-44d4-907b-25b03dc787f1-kube-api-access-bj85k\") pod \"90fff1a6-7608-44d4-907b-25b03dc787f1\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.363343 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90fff1a6-7608-44d4-907b-25b03dc787f1-operator-scripts\") pod \"90fff1a6-7608-44d4-907b-25b03dc787f1\" (UID: \"90fff1a6-7608-44d4-907b-25b03dc787f1\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.363371 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031a0a0a-23c0-4e63-959a-a18b826d985c-operator-scripts\") pod \"031a0a0a-23c0-4e63-959a-a18b826d985c\" (UID: \"031a0a0a-23c0-4e63-959a-a18b826d985c\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.363697 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfzmg\" (UniqueName: \"kubernetes.io/projected/aca9b94a-b7c4-46ef-afe9-a61bdf789138-kube-api-access-cfzmg\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.363711 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aca9b94a-b7c4-46ef-afe9-a61bdf789138-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.364254 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/031a0a0a-23c0-4e63-959a-a18b826d985c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "031a0a0a-23c0-4e63-959a-a18b826d985c" (UID: "031a0a0a-23c0-4e63-959a-a18b826d985c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.364656 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90fff1a6-7608-44d4-907b-25b03dc787f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90fff1a6-7608-44d4-907b-25b03dc787f1" (UID: "90fff1a6-7608-44d4-907b-25b03dc787f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.370392 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fff1a6-7608-44d4-907b-25b03dc787f1-kube-api-access-bj85k" (OuterVolumeSpecName: "kube-api-access-bj85k") pod "90fff1a6-7608-44d4-907b-25b03dc787f1" (UID: "90fff1a6-7608-44d4-907b-25b03dc787f1"). InnerVolumeSpecName "kube-api-access-bj85k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.376794 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031a0a0a-23c0-4e63-959a-a18b826d985c-kube-api-access-8cf9w" (OuterVolumeSpecName: "kube-api-access-8cf9w") pod "031a0a0a-23c0-4e63-959a-a18b826d985c" (UID: "031a0a0a-23c0-4e63-959a-a18b826d985c"). InnerVolumeSpecName "kube-api-access-8cf9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.464969 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7rg7\" (UniqueName: \"kubernetes.io/projected/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-kube-api-access-g7rg7\") pod \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.465117 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-operator-scripts\") pod \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\" (UID: \"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae\") " Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.465501 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj85k\" (UniqueName: \"kubernetes.io/projected/90fff1a6-7608-44d4-907b-25b03dc787f1-kube-api-access-bj85k\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.465522 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90fff1a6-7608-44d4-907b-25b03dc787f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.465530 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031a0a0a-23c0-4e63-959a-a18b826d985c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.465558 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cf9w\" (UniqueName: \"kubernetes.io/projected/031a0a0a-23c0-4e63-959a-a18b826d985c-kube-api-access-8cf9w\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.465826 4869 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac8a7aa9-c42d-4074-a665-a6b2bdc572ae" (UID: "ac8a7aa9-c42d-4074-a665-a6b2bdc572ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.467587 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-kube-api-access-g7rg7" (OuterVolumeSpecName: "kube-api-access-g7rg7") pod "ac8a7aa9-c42d-4074-a665-a6b2bdc572ae" (UID: "ac8a7aa9-c42d-4074-a665-a6b2bdc572ae"). InnerVolumeSpecName "kube-api-access-g7rg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.568831 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7rg7\" (UniqueName: \"kubernetes.io/projected/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-kube-api-access-g7rg7\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.568875 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.810489 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8j2j6" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.810483 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8j2j6" event={"ID":"031a0a0a-23c0-4e63-959a-a18b826d985c","Type":"ContainerDied","Data":"fc48e29a4cf8351551c4e5ae9905ae31b3194c17b895b89bc41846a66daa96ec"} Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.810567 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc48e29a4cf8351551c4e5ae9905ae31b3194c17b895b89bc41846a66daa96ec" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.812137 4869 generic.go:334] "Generic (PLEG): container finished" podID="db6dbe81-f329-4b21-be85-3ac61ed4c428" containerID="a108c6e5768b08f563571e7b64d9e72cb794f59a6c7914088bdece5e9d12fc27" exitCode=0 Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.812224 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbvj6" event={"ID":"db6dbe81-f329-4b21-be85-3ac61ed4c428","Type":"ContainerDied","Data":"a108c6e5768b08f563571e7b64d9e72cb794f59a6c7914088bdece5e9d12fc27"} Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.813977 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0104-account-create-update-c8czb" event={"ID":"90fff1a6-7608-44d4-907b-25b03dc787f1","Type":"ContainerDied","Data":"1f4a57f9e7fdf3cdc97581bce6667fe126731f589e56ef4456cb562ef90250b5"} Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.814001 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0104-account-create-update-c8czb" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.814008 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4a57f9e7fdf3cdc97581bce6667fe126731f589e56ef4456cb562ef90250b5" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.815965 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-379e-account-create-update-4fhtb" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.815990 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-379e-account-create-update-4fhtb" event={"ID":"aca9b94a-b7c4-46ef-afe9-a61bdf789138","Type":"ContainerDied","Data":"8ecf8dbe043fe55097bedfc7f7d5aeaf64f733f09bd4e3bcb43fcbd42bbe4192"} Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.816005 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ecf8dbe043fe55097bedfc7f7d5aeaf64f733f09bd4e3bcb43fcbd42bbe4192" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.816827 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-hsxhc" event={"ID":"ac8a7aa9-c42d-4074-a665-a6b2bdc572ae","Type":"ContainerDied","Data":"5e6fb3ce951f0160a3c9e56e5decebf3b477ae08441ea70263b1466c627d8950"} Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.816845 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e6fb3ce951f0160a3c9e56e5decebf3b477ae08441ea70263b1466c627d8950" Mar 12 15:06:41 crc kubenswrapper[4869]: I0312 15:06:41.816894 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-hsxhc" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.345361 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5a854f-c386-4811-bc85-90fa35c9915a" path="/var/lib/kubelet/pods/4d5a854f-c386-4811-bc85-90fa35c9915a/volumes" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347201 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zn9wl"] Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347503 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fff1a6-7608-44d4-907b-25b03dc787f1" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347523 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fff1a6-7608-44d4-907b-25b03dc787f1" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347557 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerName="init" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347566 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerName="init" Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347576 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca9b94a-b7c4-46ef-afe9-a61bdf789138" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347582 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca9b94a-b7c4-46ef-afe9-a61bdf789138" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347593 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8a7aa9-c42d-4074-a665-a6b2bdc572ae" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347598 4869 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ac8a7aa9-c42d-4074-a665-a6b2bdc572ae" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347609 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerName="dnsmasq-dns" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347616 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerName="dnsmasq-dns" Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347629 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47703057-3858-435c-876e-79d208f7e023" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347637 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="47703057-3858-435c-876e-79d208f7e023" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347648 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="153f24a9-9165-4c37-87da-8b316a1a64e2" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347655 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="153f24a9-9165-4c37-87da-8b316a1a64e2" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: E0312 15:06:42.347687 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031a0a0a-23c0-4e63-959a-a18b826d985c" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347694 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="031a0a0a-23c0-4e63-959a-a18b826d985c" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347852 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="031a0a0a-23c0-4e63-959a-a18b826d985c" containerName="mariadb-database-create" Mar 12 15:06:42 crc 
kubenswrapper[4869]: I0312 15:06:42.347871 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fff1a6-7608-44d4-907b-25b03dc787f1" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347882 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5a854f-c386-4811-bc85-90fa35c9915a" containerName="dnsmasq-dns" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347893 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca9b94a-b7c4-46ef-afe9-a61bdf789138" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347905 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="153f24a9-9165-4c37-87da-8b316a1a64e2" containerName="mariadb-account-create-update" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347918 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="47703057-3858-435c-876e-79d208f7e023" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.347927 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8a7aa9-c42d-4074-a665-a6b2bdc572ae" containerName="mariadb-database-create" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.348488 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.351224 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4pjv" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.351356 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.358114 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zn9wl"] Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.484107 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-combined-ca-bundle\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.484198 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-config-data\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.484363 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksjfs\" (UniqueName: \"kubernetes.io/projected/d3f34b7c-48b6-4191-b62a-8d8c373b3197-kube-api-access-ksjfs\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.484527 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-db-sync-config-data\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.586251 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-combined-ca-bundle\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.586320 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-config-data\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.586353 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksjfs\" (UniqueName: \"kubernetes.io/projected/d3f34b7c-48b6-4191-b62a-8d8c373b3197-kube-api-access-ksjfs\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.586393 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-db-sync-config-data\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.592133 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-config-data\") pod \"glance-db-sync-zn9wl\" (UID: 
\"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.592253 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-combined-ca-bundle\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.593610 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-db-sync-config-data\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.604074 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksjfs\" (UniqueName: \"kubernetes.io/projected/d3f34b7c-48b6-4191-b62a-8d8c373b3197-kube-api-access-ksjfs\") pod \"glance-db-sync-zn9wl\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:42 crc kubenswrapper[4869]: I0312 15:06:42.663992 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zn9wl" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.118507 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.161180 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zn9wl"] Mar 12 15:06:43 crc kubenswrapper[4869]: W0312 15:06:43.163357 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3f34b7c_48b6_4191_b62a_8d8c373b3197.slice/crio-aaa7d15f7f333b0dfe3e3c6e65c0fbb63feb9d24e6a2c1636d3e34904cf55f27 WatchSource:0}: Error finding container aaa7d15f7f333b0dfe3e3c6e65c0fbb63feb9d24e6a2c1636d3e34904cf55f27: Status 404 returned error can't find the container with id aaa7d15f7f333b0dfe3e3c6e65c0fbb63feb9d24e6a2c1636d3e34904cf55f27 Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.204451 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/db6dbe81-f329-4b21-be85-3ac61ed4c428-etc-swift\") pod \"db6dbe81-f329-4b21-be85-3ac61ed4c428\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.204528 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-dispersionconf\") pod \"db6dbe81-f329-4b21-be85-3ac61ed4c428\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.204604 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-combined-ca-bundle\") pod \"db6dbe81-f329-4b21-be85-3ac61ed4c428\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.204636 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-scripts\") pod \"db6dbe81-f329-4b21-be85-3ac61ed4c428\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.204680 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx5r5\" (UniqueName: \"kubernetes.io/projected/db6dbe81-f329-4b21-be85-3ac61ed4c428-kube-api-access-zx5r5\") pod \"db6dbe81-f329-4b21-be85-3ac61ed4c428\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.204696 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-ring-data-devices\") pod \"db6dbe81-f329-4b21-be85-3ac61ed4c428\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.204754 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-swiftconf\") pod \"db6dbe81-f329-4b21-be85-3ac61ed4c428\" (UID: \"db6dbe81-f329-4b21-be85-3ac61ed4c428\") " Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.206285 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db6dbe81-f329-4b21-be85-3ac61ed4c428-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "db6dbe81-f329-4b21-be85-3ac61ed4c428" (UID: "db6dbe81-f329-4b21-be85-3ac61ed4c428"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.206903 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "db6dbe81-f329-4b21-be85-3ac61ed4c428" (UID: "db6dbe81-f329-4b21-be85-3ac61ed4c428"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.211819 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6dbe81-f329-4b21-be85-3ac61ed4c428-kube-api-access-zx5r5" (OuterVolumeSpecName: "kube-api-access-zx5r5") pod "db6dbe81-f329-4b21-be85-3ac61ed4c428" (UID: "db6dbe81-f329-4b21-be85-3ac61ed4c428"). InnerVolumeSpecName "kube-api-access-zx5r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.212096 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "db6dbe81-f329-4b21-be85-3ac61ed4c428" (UID: "db6dbe81-f329-4b21-be85-3ac61ed4c428"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.224660 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-scripts" (OuterVolumeSpecName: "scripts") pod "db6dbe81-f329-4b21-be85-3ac61ed4c428" (UID: "db6dbe81-f329-4b21-be85-3ac61ed4c428"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.225874 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db6dbe81-f329-4b21-be85-3ac61ed4c428" (UID: "db6dbe81-f329-4b21-be85-3ac61ed4c428"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.236601 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "db6dbe81-f329-4b21-be85-3ac61ed4c428" (UID: "db6dbe81-f329-4b21-be85-3ac61ed4c428"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.306012 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx5r5\" (UniqueName: \"kubernetes.io/projected/db6dbe81-f329-4b21-be85-3ac61ed4c428-kube-api-access-zx5r5\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.306042 4869 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.306051 4869 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.306059 4869 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/db6dbe81-f329-4b21-be85-3ac61ed4c428-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 
12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.306067 4869 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.306077 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6dbe81-f329-4b21-be85-3ac61ed4c428-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.306087 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db6dbe81-f329-4b21-be85-3ac61ed4c428-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.677363 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-84tzk"] Mar 12 15:06:43 crc kubenswrapper[4869]: E0312 15:06:43.677719 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6dbe81-f329-4b21-be85-3ac61ed4c428" containerName="swift-ring-rebalance" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.677733 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6dbe81-f329-4b21-be85-3ac61ed4c428" containerName="swift-ring-rebalance" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.677909 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6dbe81-f329-4b21-be85-3ac61ed4c428" containerName="swift-ring-rebalance" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.678556 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.681642 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.688063 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-84tzk"] Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.813132 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-operator-scripts\") pod \"root-account-create-update-84tzk\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.813194 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm427\" (UniqueName: \"kubernetes.io/projected/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-kube-api-access-mm427\") pod \"root-account-create-update-84tzk\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.838508 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbvj6" event={"ID":"db6dbe81-f329-4b21-be85-3ac61ed4c428","Type":"ContainerDied","Data":"68a317b6d2b3c98a0dccaf89378cc734295cc49067b94e65174ad5b7936fa478"} Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.838535 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mbvj6" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.838564 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a317b6d2b3c98a0dccaf89378cc734295cc49067b94e65174ad5b7936fa478" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.839974 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zn9wl" event={"ID":"d3f34b7c-48b6-4191-b62a-8d8c373b3197","Type":"ContainerStarted","Data":"aaa7d15f7f333b0dfe3e3c6e65c0fbb63feb9d24e6a2c1636d3e34904cf55f27"} Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.915196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-operator-scripts\") pod \"root-account-create-update-84tzk\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.915254 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm427\" (UniqueName: \"kubernetes.io/projected/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-kube-api-access-mm427\") pod \"root-account-create-update-84tzk\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.916380 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-operator-scripts\") pod \"root-account-create-update-84tzk\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.931525 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm427\" (UniqueName: 
\"kubernetes.io/projected/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-kube-api-access-mm427\") pod \"root-account-create-update-84tzk\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:43 crc kubenswrapper[4869]: I0312 15:06:43.996086 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:44 crc kubenswrapper[4869]: I0312 15:06:44.272392 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-84tzk"] Mar 12 15:06:44 crc kubenswrapper[4869]: W0312 15:06:44.275732 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee3ac95d_a850_4fbe_8b50_32e1f893ed4d.slice/crio-2e764b56c8996ba42b0ed077e39f1e2d7ddb8fc712de1f5d95f4a5f312f30baf WatchSource:0}: Error finding container 2e764b56c8996ba42b0ed077e39f1e2d7ddb8fc712de1f5d95f4a5f312f30baf: Status 404 returned error can't find the container with id 2e764b56c8996ba42b0ed077e39f1e2d7ddb8fc712de1f5d95f4a5f312f30baf Mar 12 15:06:44 crc kubenswrapper[4869]: I0312 15:06:44.849199 4869 generic.go:334] "Generic (PLEG): container finished" podID="ee3ac95d-a850-4fbe-8b50-32e1f893ed4d" containerID="7f4f5d1dfdf83a25a682596cf7fa1dab03645541b1862af39fad01aa7f207ea7" exitCode=0 Mar 12 15:06:44 crc kubenswrapper[4869]: I0312 15:06:44.849241 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-84tzk" event={"ID":"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d","Type":"ContainerDied","Data":"7f4f5d1dfdf83a25a682596cf7fa1dab03645541b1862af39fad01aa7f207ea7"} Mar 12 15:06:44 crc kubenswrapper[4869]: I0312 15:06:44.849612 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-84tzk" 
event={"ID":"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d","Type":"ContainerStarted","Data":"2e764b56c8996ba42b0ed077e39f1e2d7ddb8fc712de1f5d95f4a5f312f30baf"} Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.161672 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.264406 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-operator-scripts\") pod \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.264838 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm427\" (UniqueName: \"kubernetes.io/projected/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-kube-api-access-mm427\") pod \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\" (UID: \"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d\") " Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.265153 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.266678 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee3ac95d-a850-4fbe-8b50-32e1f893ed4d" (UID: "ee3ac95d-a850-4fbe-8b50-32e1f893ed4d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.271292 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-kube-api-access-mm427" (OuterVolumeSpecName: "kube-api-access-mm427") pod "ee3ac95d-a850-4fbe-8b50-32e1f893ed4d" (UID: "ee3ac95d-a850-4fbe-8b50-32e1f893ed4d"). InnerVolumeSpecName "kube-api-access-mm427". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.275250 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c2260d9c-2497-44bb-9952-341844cf85d0-etc-swift\") pod \"swift-storage-0\" (UID: \"c2260d9c-2497-44bb-9952-341844cf85d0\") " pod="openstack/swift-storage-0" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.288705 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.366974 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm427\" (UniqueName: \"kubernetes.io/projected/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-kube-api-access-mm427\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.367003 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.782962 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.866378 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"3c4723d1bb34ec95af041b302a75e82e2cb7a763c81e10398f5b01a9fe38e16c"} Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.868464 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-84tzk" event={"ID":"ee3ac95d-a850-4fbe-8b50-32e1f893ed4d","Type":"ContainerDied","Data":"2e764b56c8996ba42b0ed077e39f1e2d7ddb8fc712de1f5d95f4a5f312f30baf"} Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.868515 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e764b56c8996ba42b0ed077e39f1e2d7ddb8fc712de1f5d95f4a5f312f30baf" Mar 12 15:06:46 crc kubenswrapper[4869]: I0312 15:06:46.868524 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-84tzk" Mar 12 15:06:48 crc kubenswrapper[4869]: I0312 15:06:48.209111 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 12 15:06:50 crc kubenswrapper[4869]: I0312 15:06:50.136231 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-84tzk"] Mar 12 15:06:50 crc kubenswrapper[4869]: I0312 15:06:50.141806 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-84tzk"] Mar 12 15:06:50 crc kubenswrapper[4869]: I0312 15:06:50.348479 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3ac95d-a850-4fbe-8b50-32e1f893ed4d" path="/var/lib/kubelet/pods/ee3ac95d-a850-4fbe-8b50-32e1f893ed4d/volumes" Mar 12 15:06:52 crc kubenswrapper[4869]: I0312 15:06:52.000818 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wzzxq" podUID="ad973222-a042-43af-9c00-b0f6d795c7d1" containerName="ovn-controller" probeResult="failure" output=< Mar 12 15:06:52 crc kubenswrapper[4869]: ERROR - ovn-controller connection status is 'not connected', 
expecting 'connected' status Mar 12 15:06:52 crc kubenswrapper[4869]: > Mar 12 15:06:52 crc kubenswrapper[4869]: I0312 15:06:52.052259 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:53 crc kubenswrapper[4869]: I0312 15:06:53.931979 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"01341cd6f2e663abd08f832a77b9be1c8492ee438190b20e66013a803b8e4223"} Mar 12 15:06:53 crc kubenswrapper[4869]: I0312 15:06:53.932345 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"7346fac71244f33c7c31b96c26f89adc59eba371911ab596c61f3f5c73666a13"} Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.943823 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"2b480238ae951eea047143565144418a1af25781990172ea1b8e33d2a7434a09"} Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.944410 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"5a08cafbf8e7537c8dce3478a0a312b11bb24ee7ee83dc6f93ff4aaf27a19e66"} Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.946412 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zn9wl" event={"ID":"d3f34b7c-48b6-4191-b62a-8d8c373b3197","Type":"ContainerStarted","Data":"a481a856bf43915cc69cc9fb7c87bdace3c38e60a67f180da8a40a46b4c0e65e"} Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.953831 4869 generic.go:334] "Generic (PLEG): container finished" podID="e0323899-ea3b-4572-baa4-3483b0d5fd86" 
containerID="4fcf47aa2c397bc1f0fa16216f4cc78821cbeb8188e49ef858d02fe2468e098b" exitCode=0 Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.953901 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0323899-ea3b-4572-baa4-3483b0d5fd86","Type":"ContainerDied","Data":"4fcf47aa2c397bc1f0fa16216f4cc78821cbeb8188e49ef858d02fe2468e098b"} Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.956116 4869 generic.go:334] "Generic (PLEG): container finished" podID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerID="fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5" exitCode=0 Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.956147 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e764959-1933-4a88-b8de-fd853d49a0d3","Type":"ContainerDied","Data":"fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5"} Mar 12 15:06:54 crc kubenswrapper[4869]: I0312 15:06:54.967813 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zn9wl" podStartSLOduration=2.581369337 podStartE2EDuration="12.967797394s" podCreationTimestamp="2026-03-12 15:06:42 +0000 UTC" firstStartedPulling="2026-03-12 15:06:43.1652079 +0000 UTC m=+1155.450433178" lastFinishedPulling="2026-03-12 15:06:53.551635957 +0000 UTC m=+1165.836861235" observedRunningTime="2026-03-12 15:06:54.965797526 +0000 UTC m=+1167.251022804" watchObservedRunningTime="2026-03-12 15:06:54.967797394 +0000 UTC m=+1167.253022692" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.138468 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z6955"] Mar 12 15:06:55 crc kubenswrapper[4869]: E0312 15:06:55.138795 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3ac95d-a850-4fbe-8b50-32e1f893ed4d" containerName="mariadb-account-create-update" Mar 12 15:06:55 crc kubenswrapper[4869]: 
I0312 15:06:55.138813 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3ac95d-a850-4fbe-8b50-32e1f893ed4d" containerName="mariadb-account-create-update" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.138993 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3ac95d-a850-4fbe-8b50-32e1f893ed4d" containerName="mariadb-account-create-update" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.139617 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.142321 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.167614 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z6955"] Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.242716 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk58l\" (UniqueName: \"kubernetes.io/projected/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-kube-api-access-nk58l\") pod \"root-account-create-update-z6955\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.242924 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-operator-scripts\") pod \"root-account-create-update-z6955\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.344383 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk58l\" (UniqueName: 
\"kubernetes.io/projected/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-kube-api-access-nk58l\") pod \"root-account-create-update-z6955\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.344839 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-operator-scripts\") pod \"root-account-create-update-z6955\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.346027 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-operator-scripts\") pod \"root-account-create-update-z6955\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.364903 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk58l\" (UniqueName: \"kubernetes.io/projected/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-kube-api-access-nk58l\") pod \"root-account-create-update-z6955\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.573397 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z6955" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.971456 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0323899-ea3b-4572-baa4-3483b0d5fd86","Type":"ContainerStarted","Data":"2697723ecc4b8b6205845e7410eff3ca8ae4c708e1ae9679c82243acb8b0acc3"} Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.972085 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.973622 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e764959-1933-4a88-b8de-fd853d49a0d3","Type":"ContainerStarted","Data":"d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548"} Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.974146 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.980386 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"68f4ebeac2a02e70975f0554421c1348ae1aa3f88a6ec0c76cd63407e8581064"} Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.980436 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"673ff282a5f2da6f8fda6a0fc26b046c537939f18080271f4d4fff00fe2382b4"} Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.980452 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"f65f631e364899c9a0974848667744d844d07e80d9ae72ac3ff408b600cd204d"} Mar 12 15:06:55 crc kubenswrapper[4869]: I0312 15:06:55.997329 4869 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.887807689 podStartE2EDuration="1m3.997310583s" podCreationTimestamp="2026-03-12 15:05:52 +0000 UTC" firstStartedPulling="2026-03-12 15:06:05.362866987 +0000 UTC m=+1117.648092265" lastFinishedPulling="2026-03-12 15:06:19.472369881 +0000 UTC m=+1131.757595159" observedRunningTime="2026-03-12 15:06:55.995286765 +0000 UTC m=+1168.280512053" watchObservedRunningTime="2026-03-12 15:06:55.997310583 +0000 UTC m=+1168.282535861" Mar 12 15:06:56 crc kubenswrapper[4869]: I0312 15:06:56.030480 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.221245647 podStartE2EDuration="1m4.030460193s" podCreationTimestamp="2026-03-12 15:05:52 +0000 UTC" firstStartedPulling="2026-03-12 15:06:06.373240598 +0000 UTC m=+1118.658465876" lastFinishedPulling="2026-03-12 15:06:19.182455144 +0000 UTC m=+1131.467680422" observedRunningTime="2026-03-12 15:06:56.024339007 +0000 UTC m=+1168.309564295" watchObservedRunningTime="2026-03-12 15:06:56.030460193 +0000 UTC m=+1168.315685471" Mar 12 15:06:56 crc kubenswrapper[4869]: I0312 15:06:56.043539 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z6955"] Mar 12 15:06:56 crc kubenswrapper[4869]: W0312 15:06:56.044714 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8d7822f_0666_4c82_9dc3_8f2074cdc5e0.slice/crio-0b75bafdf391073888b58b5860a61f1d26574cc2fa79eb305536233d6dbcfdfc WatchSource:0}: Error finding container 0b75bafdf391073888b58b5860a61f1d26574cc2fa79eb305536233d6dbcfdfc: Status 404 returned error can't find the container with id 0b75bafdf391073888b58b5860a61f1d26574cc2fa79eb305536233d6dbcfdfc Mar 12 15:06:56 crc kubenswrapper[4869]: I0312 15:06:56.988047 4869 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/root-account-create-update-z6955" event={"ID":"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0","Type":"ContainerStarted","Data":"0b75bafdf391073888b58b5860a61f1d26574cc2fa79eb305536233d6dbcfdfc"} Mar 12 15:06:56 crc kubenswrapper[4869]: I0312 15:06:56.991515 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"2717cb5040fe4bcf0df6ca0d2204f56a05b1083851e7d0eedaecedea53513158"} Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.006135 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wzzxq" podUID="ad973222-a042-43af-9c00-b0f6d795c7d1" containerName="ovn-controller" probeResult="failure" output=< Mar 12 15:06:57 crc kubenswrapper[4869]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 12 15:06:57 crc kubenswrapper[4869]: > Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.049285 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jvmhv" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.270939 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wzzxq-config-s9d8p"] Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.272635 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.287921 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.304230 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wzzxq-config-s9d8p"] Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.378505 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-additional-scripts\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.378592 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.378633 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vv69\" (UniqueName: \"kubernetes.io/projected/fe8c983f-3df1-477c-9aec-93c45c582b8a-kube-api-access-2vv69\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.378705 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-log-ovn\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: 
\"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.378747 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-scripts\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.378784 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run-ovn\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.480610 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-additional-scripts\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.480679 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.480744 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vv69\" (UniqueName: \"kubernetes.io/projected/fe8c983f-3df1-477c-9aec-93c45c582b8a-kube-api-access-2vv69\") pod 
\"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.480830 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-log-ovn\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.480865 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-scripts\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.481047 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.481087 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-log-ovn\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.481501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run-ovn\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: 
\"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.481617 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run-ovn\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.481767 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-additional-scripts\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.483245 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-scripts\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.522964 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vv69\" (UniqueName: \"kubernetes.io/projected/fe8c983f-3df1-477c-9aec-93c45c582b8a-kube-api-access-2vv69\") pod \"ovn-controller-wzzxq-config-s9d8p\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:57 crc kubenswrapper[4869]: I0312 15:06:57.602461 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:06:58 crc kubenswrapper[4869]: I0312 15:06:58.092910 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wzzxq-config-s9d8p"] Mar 12 15:06:59 crc kubenswrapper[4869]: I0312 15:06:59.006315 4869 generic.go:334] "Generic (PLEG): container finished" podID="fe8c983f-3df1-477c-9aec-93c45c582b8a" containerID="03e8197617894e351966949eef740a763d3ad9420224bf1ce16c77c5602dbf98" exitCode=0 Mar 12 15:06:59 crc kubenswrapper[4869]: I0312 15:06:59.006368 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wzzxq-config-s9d8p" event={"ID":"fe8c983f-3df1-477c-9aec-93c45c582b8a","Type":"ContainerDied","Data":"03e8197617894e351966949eef740a763d3ad9420224bf1ce16c77c5602dbf98"} Mar 12 15:06:59 crc kubenswrapper[4869]: I0312 15:06:59.006855 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wzzxq-config-s9d8p" event={"ID":"fe8c983f-3df1-477c-9aec-93c45c582b8a","Type":"ContainerStarted","Data":"8c16387ac8b8f94b8a33bbea42f5f4b5169b50cb927c53c4a5cc9502e46f5481"} Mar 12 15:06:59 crc kubenswrapper[4869]: I0312 15:06:59.011689 4869 generic.go:334] "Generic (PLEG): container finished" podID="f8d7822f-0666-4c82-9dc3-8f2074cdc5e0" containerID="af10a209726e9809049a6e11300a42688ca9981619c3653fcc5a079eff069a77" exitCode=0 Mar 12 15:06:59 crc kubenswrapper[4869]: I0312 15:06:59.011742 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6955" event={"ID":"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0","Type":"ContainerDied","Data":"af10a209726e9809049a6e11300a42688ca9981619c3653fcc5a079eff069a77"} Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.028012 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"a15c5cdb3a781cae4d56d8fc515cbef410b3a94a7e414a59280b8af27e244978"} Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.028060 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"11205be22f5ff540fcfa8ad9425f77d1293bb565f6fa8f7c1ce3709625e5d73e"} Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.028074 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"9077ae6c24dd1bfd7a806dc6f9af2705600b7a950e20f5462384eb32d1a52aa4"} Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.028087 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"de0d225cac5cd47855646beaff188f62103e7c719c22b1af3cbf6294c9dfc598"} Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.028098 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"a266f8a4ecddab624dc2781e31bf2cc91d33c148f59311676ba44ad522ed3120"} Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.028110 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"0262ef6208c108ab6655d6e9fb512e2ec4ecae31b27f0951b15db71caffa27bd"} Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.429863 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.442237 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z6955" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.529814 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-operator-scripts\") pod \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.529872 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run-ovn\") pod \"fe8c983f-3df1-477c-9aec-93c45c582b8a\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.529903 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk58l\" (UniqueName: \"kubernetes.io/projected/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-kube-api-access-nk58l\") pod \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\" (UID: \"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.529945 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-log-ovn\") pod \"fe8c983f-3df1-477c-9aec-93c45c582b8a\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.529979 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vv69\" (UniqueName: \"kubernetes.io/projected/fe8c983f-3df1-477c-9aec-93c45c582b8a-kube-api-access-2vv69\") pod \"fe8c983f-3df1-477c-9aec-93c45c582b8a\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.529993 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fe8c983f-3df1-477c-9aec-93c45c582b8a" (UID: "fe8c983f-3df1-477c-9aec-93c45c582b8a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530028 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fe8c983f-3df1-477c-9aec-93c45c582b8a" (UID: "fe8c983f-3df1-477c-9aec-93c45c582b8a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530065 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-scripts\") pod \"fe8c983f-3df1-477c-9aec-93c45c582b8a\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530102 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run\") pod \"fe8c983f-3df1-477c-9aec-93c45c582b8a\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530131 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-additional-scripts\") pod \"fe8c983f-3df1-477c-9aec-93c45c582b8a\" (UID: \"fe8c983f-3df1-477c-9aec-93c45c582b8a\") " Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530159 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run" 
(OuterVolumeSpecName: "var-run") pod "fe8c983f-3df1-477c-9aec-93c45c582b8a" (UID: "fe8c983f-3df1-477c-9aec-93c45c582b8a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530584 4869 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530604 4869 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530616 4869 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fe8c983f-3df1-477c-9aec-93c45c582b8a-var-run\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530769 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fe8c983f-3df1-477c-9aec-93c45c582b8a" (UID: "fe8c983f-3df1-477c-9aec-93c45c582b8a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.530903 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8d7822f-0666-4c82-9dc3-8f2074cdc5e0" (UID: "f8d7822f-0666-4c82-9dc3-8f2074cdc5e0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.531165 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-scripts" (OuterVolumeSpecName: "scripts") pod "fe8c983f-3df1-477c-9aec-93c45c582b8a" (UID: "fe8c983f-3df1-477c-9aec-93c45c582b8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.534636 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe8c983f-3df1-477c-9aec-93c45c582b8a-kube-api-access-2vv69" (OuterVolumeSpecName: "kube-api-access-2vv69") pod "fe8c983f-3df1-477c-9aec-93c45c582b8a" (UID: "fe8c983f-3df1-477c-9aec-93c45c582b8a"). InnerVolumeSpecName "kube-api-access-2vv69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.538540 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-kube-api-access-nk58l" (OuterVolumeSpecName: "kube-api-access-nk58l") pod "f8d7822f-0666-4c82-9dc3-8f2074cdc5e0" (UID: "f8d7822f-0666-4c82-9dc3-8f2074cdc5e0"). InnerVolumeSpecName "kube-api-access-nk58l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.631549 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.631857 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk58l\" (UniqueName: \"kubernetes.io/projected/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0-kube-api-access-nk58l\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.631874 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vv69\" (UniqueName: \"kubernetes.io/projected/fe8c983f-3df1-477c-9aec-93c45c582b8a-kube-api-access-2vv69\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.631890 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:00 crc kubenswrapper[4869]: I0312 15:07:00.631901 4869 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fe8c983f-3df1-477c-9aec-93c45c582b8a-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.042812 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c2260d9c-2497-44bb-9952-341844cf85d0","Type":"ContainerStarted","Data":"61c6743502d5a0fd0f4d888ed93ceffdab1ed1521df9c21cd5b23d83328771dc"} Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.045704 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wzzxq-config-s9d8p" 
event={"ID":"fe8c983f-3df1-477c-9aec-93c45c582b8a","Type":"ContainerDied","Data":"8c16387ac8b8f94b8a33bbea42f5f4b5169b50cb927c53c4a5cc9502e46f5481"} Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.045730 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wzzxq-config-s9d8p" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.045758 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c16387ac8b8f94b8a33bbea42f5f4b5169b50cb927c53c4a5cc9502e46f5481" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.057243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z6955" event={"ID":"f8d7822f-0666-4c82-9dc3-8f2074cdc5e0","Type":"ContainerDied","Data":"0b75bafdf391073888b58b5860a61f1d26574cc2fa79eb305536233d6dbcfdfc"} Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.057302 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b75bafdf391073888b58b5860a61f1d26574cc2fa79eb305536233d6dbcfdfc" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.057329 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z6955" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.103585 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.845973042 podStartE2EDuration="32.103544324s" podCreationTimestamp="2026-03-12 15:06:29 +0000 UTC" firstStartedPulling="2026-03-12 15:06:46.794701316 +0000 UTC m=+1159.079926584" lastFinishedPulling="2026-03-12 15:06:59.052272558 +0000 UTC m=+1171.337497866" observedRunningTime="2026-03-12 15:07:01.097289865 +0000 UTC m=+1173.382515143" watchObservedRunningTime="2026-03-12 15:07:01.103544324 +0000 UTC m=+1173.388769612" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.352707 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-k4v6m"] Mar 12 15:07:01 crc kubenswrapper[4869]: E0312 15:07:01.353092 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d7822f-0666-4c82-9dc3-8f2074cdc5e0" containerName="mariadb-account-create-update" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.353111 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d7822f-0666-4c82-9dc3-8f2074cdc5e0" containerName="mariadb-account-create-update" Mar 12 15:07:01 crc kubenswrapper[4869]: E0312 15:07:01.353122 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe8c983f-3df1-477c-9aec-93c45c582b8a" containerName="ovn-config" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.353131 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8c983f-3df1-477c-9aec-93c45c582b8a" containerName="ovn-config" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.353331 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d7822f-0666-4c82-9dc3-8f2074cdc5e0" containerName="mariadb-account-create-update" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.353355 4869 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="fe8c983f-3df1-477c-9aec-93c45c582b8a" containerName="ovn-config" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.354462 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.356409 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.416377 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-k4v6m"] Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.444249 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.444369 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-config\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.444404 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.444456 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-svc\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.444487 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.444514 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flk78\" (UniqueName: \"kubernetes.io/projected/6c10d3e5-a692-41f2-9896-67f9d062cf9f-kube-api-access-flk78\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.527117 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wzzxq-config-s9d8p"] Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.533010 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wzzxq-config-s9d8p"] Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.546079 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.546666 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-config\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.546720 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.546754 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-svc\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.546778 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.546802 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flk78\" (UniqueName: \"kubernetes.io/projected/6c10d3e5-a692-41f2-9896-67f9d062cf9f-kube-api-access-flk78\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.546919 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.547516 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-config\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.547652 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-svc\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.547693 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.547810 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.563536 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flk78\" (UniqueName: \"kubernetes.io/projected/6c10d3e5-a692-41f2-9896-67f9d062cf9f-kube-api-access-flk78\") pod 
\"dnsmasq-dns-764c5664d7-k4v6m\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:01 crc kubenswrapper[4869]: I0312 15:07:01.670256 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:02 crc kubenswrapper[4869]: I0312 15:07:02.006454 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wzzxq" Mar 12 15:07:02 crc kubenswrapper[4869]: I0312 15:07:02.141495 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-k4v6m"] Mar 12 15:07:02 crc kubenswrapper[4869]: W0312 15:07:02.145738 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c10d3e5_a692_41f2_9896_67f9d062cf9f.slice/crio-138300e7f0b1b877f6221e48cfb5c9169a2913492d6c02fa63e4de3040cfde2d WatchSource:0}: Error finding container 138300e7f0b1b877f6221e48cfb5c9169a2913492d6c02fa63e4de3040cfde2d: Status 404 returned error can't find the container with id 138300e7f0b1b877f6221e48cfb5c9169a2913492d6c02fa63e4de3040cfde2d Mar 12 15:07:02 crc kubenswrapper[4869]: I0312 15:07:02.347391 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8c983f-3df1-477c-9aec-93c45c582b8a" path="/var/lib/kubelet/pods/fe8c983f-3df1-477c-9aec-93c45c582b8a/volumes" Mar 12 15:07:03 crc kubenswrapper[4869]: I0312 15:07:03.072722 4869 generic.go:334] "Generic (PLEG): container finished" podID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerID="afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5" exitCode=0 Mar 12 15:07:03 crc kubenswrapper[4869]: I0312 15:07:03.072785 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" 
event={"ID":"6c10d3e5-a692-41f2-9896-67f9d062cf9f","Type":"ContainerDied","Data":"afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5"} Mar 12 15:07:03 crc kubenswrapper[4869]: I0312 15:07:03.072809 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" event={"ID":"6c10d3e5-a692-41f2-9896-67f9d062cf9f","Type":"ContainerStarted","Data":"138300e7f0b1b877f6221e48cfb5c9169a2913492d6c02fa63e4de3040cfde2d"} Mar 12 15:07:03 crc kubenswrapper[4869]: I0312 15:07:03.075849 4869 generic.go:334] "Generic (PLEG): container finished" podID="d3f34b7c-48b6-4191-b62a-8d8c373b3197" containerID="a481a856bf43915cc69cc9fb7c87bdace3c38e60a67f180da8a40a46b4c0e65e" exitCode=0 Mar 12 15:07:03 crc kubenswrapper[4869]: I0312 15:07:03.075886 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zn9wl" event={"ID":"d3f34b7c-48b6-4191-b62a-8d8c373b3197","Type":"ContainerDied","Data":"a481a856bf43915cc69cc9fb7c87bdace3c38e60a67f180da8a40a46b4c0e65e"} Mar 12 15:07:04 crc kubenswrapper[4869]: I0312 15:07:04.083794 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" event={"ID":"6c10d3e5-a692-41f2-9896-67f9d062cf9f","Type":"ContainerStarted","Data":"8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6"} Mar 12 15:07:04 crc kubenswrapper[4869]: I0312 15:07:04.125893 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" podStartSLOduration=3.125785381 podStartE2EDuration="3.125785381s" podCreationTimestamp="2026-03-12 15:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:04.101425273 +0000 UTC m=+1176.386650561" watchObservedRunningTime="2026-03-12 15:07:04.125785381 +0000 UTC m=+1176.411010659" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.640562 4869 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zn9wl" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.696430 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-combined-ca-bundle\") pod \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.696632 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-config-data\") pod \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.696697 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-db-sync-config-data\") pod \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.696737 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksjfs\" (UniqueName: \"kubernetes.io/projected/d3f34b7c-48b6-4191-b62a-8d8c373b3197-kube-api-access-ksjfs\") pod \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\" (UID: \"d3f34b7c-48b6-4191-b62a-8d8c373b3197\") " Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.701167 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d3f34b7c-48b6-4191-b62a-8d8c373b3197" (UID: "d3f34b7c-48b6-4191-b62a-8d8c373b3197"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.717930 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f34b7c-48b6-4191-b62a-8d8c373b3197-kube-api-access-ksjfs" (OuterVolumeSpecName: "kube-api-access-ksjfs") pod "d3f34b7c-48b6-4191-b62a-8d8c373b3197" (UID: "d3f34b7c-48b6-4191-b62a-8d8c373b3197"). InnerVolumeSpecName "kube-api-access-ksjfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.721745 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3f34b7c-48b6-4191-b62a-8d8c373b3197" (UID: "d3f34b7c-48b6-4191-b62a-8d8c373b3197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.741867 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-config-data" (OuterVolumeSpecName: "config-data") pod "d3f34b7c-48b6-4191-b62a-8d8c373b3197" (UID: "d3f34b7c-48b6-4191-b62a-8d8c373b3197"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.798605 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.798637 4869 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.798662 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksjfs\" (UniqueName: \"kubernetes.io/projected/d3f34b7c-48b6-4191-b62a-8d8c373b3197-kube-api-access-ksjfs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:04.798676 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f34b7c-48b6-4191-b62a-8d8c373b3197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.091525 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zn9wl" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.094684 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zn9wl" event={"ID":"d3f34b7c-48b6-4191-b62a-8d8c373b3197","Type":"ContainerDied","Data":"aaa7d15f7f333b0dfe3e3c6e65c0fbb63feb9d24e6a2c1636d3e34904cf55f27"} Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.094733 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaa7d15f7f333b0dfe3e3c6e65c0fbb63feb9d24e6a2c1636d3e34904cf55f27" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.094759 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.474624 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-k4v6m"] Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.512736 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-dqt8v"] Mar 12 15:07:05 crc kubenswrapper[4869]: E0312 15:07:05.513175 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f34b7c-48b6-4191-b62a-8d8c373b3197" containerName="glance-db-sync" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.513201 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f34b7c-48b6-4191-b62a-8d8c373b3197" containerName="glance-db-sync" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.513415 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f34b7c-48b6-4191-b62a-8d8c373b3197" containerName="glance-db-sync" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.514497 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.593191 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-dqt8v"] Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.610677 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp4wj\" (UniqueName: \"kubernetes.io/projected/687b2dbd-45fa-4c40-887e-ef9ac210561e-kube-api-access-hp4wj\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.610722 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-config\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.610827 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.610872 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.610895 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.610913 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.712603 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-config\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.712737 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.712780 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.712807 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.712828 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.712877 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp4wj\" (UniqueName: \"kubernetes.io/projected/687b2dbd-45fa-4c40-887e-ef9ac210561e-kube-api-access-hp4wj\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.713814 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.713884 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.714082 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.714231 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-config\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.714316 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.732329 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp4wj\" (UniqueName: \"kubernetes.io/projected/687b2dbd-45fa-4c40-887e-ef9ac210561e-kube-api-access-hp4wj\") pod \"dnsmasq-dns-74f6bcbc87-dqt8v\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:05 crc kubenswrapper[4869]: I0312 15:07:05.830065 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:06 crc kubenswrapper[4869]: I0312 15:07:06.263533 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-dqt8v"] Mar 12 15:07:06 crc kubenswrapper[4869]: W0312 15:07:06.274416 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod687b2dbd_45fa_4c40_887e_ef9ac210561e.slice/crio-158b1a2ed592c6ece3859f48d75672fa87955cb0c45cc673b3c82337bc2bb875 WatchSource:0}: Error finding container 158b1a2ed592c6ece3859f48d75672fa87955cb0c45cc673b3c82337bc2bb875: Status 404 returned error can't find the container with id 158b1a2ed592c6ece3859f48d75672fa87955cb0c45cc673b3c82337bc2bb875 Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.105271 4869 generic.go:334] "Generic (PLEG): container finished" podID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerID="bc00f50c343fcdba62d10919ead04d581abc1b4c00ecde72c2e7b4ba050e0b81" exitCode=0 Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.105400 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" event={"ID":"687b2dbd-45fa-4c40-887e-ef9ac210561e","Type":"ContainerDied","Data":"bc00f50c343fcdba62d10919ead04d581abc1b4c00ecde72c2e7b4ba050e0b81"} Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.105458 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" event={"ID":"687b2dbd-45fa-4c40-887e-ef9ac210561e","Type":"ContainerStarted","Data":"158b1a2ed592c6ece3859f48d75672fa87955cb0c45cc673b3c82337bc2bb875"} Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.105506 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" podUID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerName="dnsmasq-dns" 
containerID="cri-o://8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6" gracePeriod=10 Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.547476 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.651680 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-svc\") pod \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.651777 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-swift-storage-0\") pod \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.651809 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-nb\") pod \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.652587 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flk78\" (UniqueName: \"kubernetes.io/projected/6c10d3e5-a692-41f2-9896-67f9d062cf9f-kube-api-access-flk78\") pod \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.652614 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-config\") pod 
\"6c10d3e5-a692-41f2-9896-67f9d062cf9f\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.652639 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-sb\") pod \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\" (UID: \"6c10d3e5-a692-41f2-9896-67f9d062cf9f\") " Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.657667 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c10d3e5-a692-41f2-9896-67f9d062cf9f-kube-api-access-flk78" (OuterVolumeSpecName: "kube-api-access-flk78") pod "6c10d3e5-a692-41f2-9896-67f9d062cf9f" (UID: "6c10d3e5-a692-41f2-9896-67f9d062cf9f"). InnerVolumeSpecName "kube-api-access-flk78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.703952 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c10d3e5-a692-41f2-9896-67f9d062cf9f" (UID: "6c10d3e5-a692-41f2-9896-67f9d062cf9f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.704099 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c10d3e5-a692-41f2-9896-67f9d062cf9f" (UID: "6c10d3e5-a692-41f2-9896-67f9d062cf9f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.706874 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c10d3e5-a692-41f2-9896-67f9d062cf9f" (UID: "6c10d3e5-a692-41f2-9896-67f9d062cf9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.709289 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c10d3e5-a692-41f2-9896-67f9d062cf9f" (UID: "6c10d3e5-a692-41f2-9896-67f9d062cf9f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.712857 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-config" (OuterVolumeSpecName: "config") pod "6c10d3e5-a692-41f2-9896-67f9d062cf9f" (UID: "6c10d3e5-a692-41f2-9896-67f9d062cf9f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.754956 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flk78\" (UniqueName: \"kubernetes.io/projected/6c10d3e5-a692-41f2-9896-67f9d062cf9f-kube-api-access-flk78\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.755003 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.755022 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.755034 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.755047 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:07 crc kubenswrapper[4869]: I0312 15:07:07.755058 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c10d3e5-a692-41f2-9896-67f9d062cf9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.113595 4869 generic.go:334] "Generic (PLEG): container finished" podID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerID="8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6" exitCode=0 Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.113648 4869 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.113665 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" event={"ID":"6c10d3e5-a692-41f2-9896-67f9d062cf9f","Type":"ContainerDied","Data":"8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6"} Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.113992 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-k4v6m" event={"ID":"6c10d3e5-a692-41f2-9896-67f9d062cf9f","Type":"ContainerDied","Data":"138300e7f0b1b877f6221e48cfb5c9169a2913492d6c02fa63e4de3040cfde2d"} Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.114016 4869 scope.go:117] "RemoveContainer" containerID="8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.115519 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" event={"ID":"687b2dbd-45fa-4c40-887e-ef9ac210561e","Type":"ContainerStarted","Data":"b315b65b82ba9afaa7e2b718cd9b4b04e3b2b1619ef13670192f0e8e37f67d60"} Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.115726 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.143965 4869 scope.go:117] "RemoveContainer" containerID="afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.146175 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" podStartSLOduration=3.146154058 podStartE2EDuration="3.146154058s" podCreationTimestamp="2026-03-12 15:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 15:07:08.142083951 +0000 UTC m=+1180.427309229" watchObservedRunningTime="2026-03-12 15:07:08.146154058 +0000 UTC m=+1180.431379356" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.166713 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-k4v6m"] Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.169744 4869 scope.go:117] "RemoveContainer" containerID="8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6" Mar 12 15:07:08 crc kubenswrapper[4869]: E0312 15:07:08.170350 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6\": container with ID starting with 8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6 not found: ID does not exist" containerID="8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.170390 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6"} err="failed to get container status \"8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6\": rpc error: code = NotFound desc = could not find container \"8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6\": container with ID starting with 8e5a3316f72a1ef07c14850a5e807e5653a17a4973a841bb9a83e9b42324dac6 not found: ID does not exist" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.170418 4869 scope.go:117] "RemoveContainer" containerID="afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5" Mar 12 15:07:08 crc kubenswrapper[4869]: E0312 15:07:08.170930 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5\": container with ID starting with afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5 not found: ID does not exist" containerID="afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.170964 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5"} err="failed to get container status \"afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5\": rpc error: code = NotFound desc = could not find container \"afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5\": container with ID starting with afaf038a40867135337a9a205530d376d574eb580c7749ec339e3d2fc39891c5 not found: ID does not exist" Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.175410 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-k4v6m"] Mar 12 15:07:08 crc kubenswrapper[4869]: I0312 15:07:08.348463 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" path="/var/lib/kubelet/pods/6c10d3e5-a692-41f2-9896-67f9d062cf9f/volumes" Mar 12 15:07:13 crc kubenswrapper[4869]: I0312 15:07:13.640875 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:07:14 crc kubenswrapper[4869]: I0312 15:07:14.010935 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.412739 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-468h4"] Mar 12 15:07:15 crc kubenswrapper[4869]: E0312 15:07:15.413332 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerName="init" Mar 12 15:07:15 crc 
kubenswrapper[4869]: I0312 15:07:15.413344 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerName="init" Mar 12 15:07:15 crc kubenswrapper[4869]: E0312 15:07:15.413367 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerName="dnsmasq-dns" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.413374 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerName="dnsmasq-dns" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.413528 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c10d3e5-a692-41f2-9896-67f9d062cf9f" containerName="dnsmasq-dns" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.414093 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.429488 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-468h4"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.491613 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e31b3289-8593-4def-be89-bed7b813ef1b-operator-scripts\") pod \"manila-db-create-468h4\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.491891 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnp9d\" (UniqueName: \"kubernetes.io/projected/e31b3289-8593-4def-be89-bed7b813ef1b-kube-api-access-hnp9d\") pod \"manila-db-create-468h4\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.499783 4869 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-db-create-gfwwb"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.500761 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.525490 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gfwwb"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.538862 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-318e-account-create-update-zvxwz"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.540447 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.544891 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.561334 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-318e-account-create-update-zvxwz"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.595731 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlff\" (UniqueName: \"kubernetes.io/projected/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-kube-api-access-4vlff\") pod \"cinder-db-create-gfwwb\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.595787 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-operator-scripts\") pod \"cinder-318e-account-create-update-zvxwz\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 
15:07:15.595836 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e31b3289-8593-4def-be89-bed7b813ef1b-operator-scripts\") pod \"manila-db-create-468h4\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.596051 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76kl5\" (UniqueName: \"kubernetes.io/projected/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-kube-api-access-76kl5\") pod \"cinder-318e-account-create-update-zvxwz\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.596123 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnp9d\" (UniqueName: \"kubernetes.io/projected/e31b3289-8593-4def-be89-bed7b813ef1b-kube-api-access-hnp9d\") pod \"manila-db-create-468h4\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.596177 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-operator-scripts\") pod \"cinder-db-create-gfwwb\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.596443 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e31b3289-8593-4def-be89-bed7b813ef1b-operator-scripts\") pod \"manila-db-create-468h4\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 
15:07:15.605131 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-aa40-account-create-update-dj2sh"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.606368 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.609604 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.616766 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aa40-account-create-update-dj2sh"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.638770 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnp9d\" (UniqueName: \"kubernetes.io/projected/e31b3289-8593-4def-be89-bed7b813ef1b-kube-api-access-hnp9d\") pod \"manila-db-create-468h4\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.681639 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-6589-account-create-update-flcbb"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.683193 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.685093 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.697704 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76kl5\" (UniqueName: \"kubernetes.io/projected/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-kube-api-access-76kl5\") pod \"cinder-318e-account-create-update-zvxwz\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.697762 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8504b7-ac15-45df-b26a-7c848cf4c068-operator-scripts\") pod \"barbican-aa40-account-create-update-dj2sh\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.697786 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-operator-scripts\") pod \"cinder-db-create-gfwwb\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.697843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlff\" (UniqueName: \"kubernetes.io/projected/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-kube-api-access-4vlff\") pod \"cinder-db-create-gfwwb\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.697872 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-operator-scripts\") pod \"cinder-318e-account-create-update-zvxwz\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.697918 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw26j\" (UniqueName: \"kubernetes.io/projected/7b8504b7-ac15-45df-b26a-7c848cf4c068-kube-api-access-rw26j\") pod \"barbican-aa40-account-create-update-dj2sh\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.699052 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-operator-scripts\") pod \"cinder-318e-account-create-update-zvxwz\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.699351 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-operator-scripts\") pod \"cinder-db-create-gfwwb\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.714430 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-6589-account-create-update-flcbb"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.734641 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-468h4" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.737559 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlff\" (UniqueName: \"kubernetes.io/projected/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-kube-api-access-4vlff\") pod \"cinder-db-create-gfwwb\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.741070 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76kl5\" (UniqueName: \"kubernetes.io/projected/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-kube-api-access-76kl5\") pod \"cinder-318e-account-create-update-zvxwz\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.799434 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kpp2\" (UniqueName: \"kubernetes.io/projected/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-kube-api-access-6kpp2\") pod \"manila-6589-account-create-update-flcbb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.799526 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-operator-scripts\") pod \"manila-6589-account-create-update-flcbb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.799602 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw26j\" (UniqueName: 
\"kubernetes.io/projected/7b8504b7-ac15-45df-b26a-7c848cf4c068-kube-api-access-rw26j\") pod \"barbican-aa40-account-create-update-dj2sh\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.799664 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8504b7-ac15-45df-b26a-7c848cf4c068-operator-scripts\") pod \"barbican-aa40-account-create-update-dj2sh\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.800616 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8504b7-ac15-45df-b26a-7c848cf4c068-operator-scripts\") pod \"barbican-aa40-account-create-update-dj2sh\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.808496 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4nrzc"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.818881 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.819225 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.819870 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw26j\" (UniqueName: \"kubernetes.io/projected/7b8504b7-ac15-45df-b26a-7c848cf4c068-kube-api-access-rw26j\") pod \"barbican-aa40-account-create-update-dj2sh\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.835765 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.847651 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4nrzc"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.861869 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.873425 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-h89v6"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.874486 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.880146 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.894209 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.894522 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.895145 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cl6hz" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.898151 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h89v6"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.908725 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kpp2\" (UniqueName: \"kubernetes.io/projected/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-kube-api-access-6kpp2\") pod \"manila-6589-account-create-update-flcbb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.908819 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-operator-scripts\") pod \"manila-6589-account-create-update-flcbb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.908891 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-operator-scripts\") 
pod \"barbican-db-create-4nrzc\" (UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.908927 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnbn5\" (UniqueName: \"kubernetes.io/projected/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-kube-api-access-bnbn5\") pod \"barbican-db-create-4nrzc\" (UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.914236 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-operator-scripts\") pod \"manila-6589-account-create-update-flcbb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.918580 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-n44jd"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.919806 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.954190 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mkmpj"] Mar 12 15:07:15 crc kubenswrapper[4869]: I0312 15:07:15.954933 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-mkmpj" podUID="349a0f9a-cebb-4465-9774-b497c95357b6" containerName="dnsmasq-dns" containerID="cri-o://0c07fd9e49269aa67a9298f7761da7f26974c0466717a12acdcc05514613aa2a" gracePeriod=10 Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.001506 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kpp2\" (UniqueName: \"kubernetes.io/projected/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-kube-api-access-6kpp2\") pod \"manila-6589-account-create-update-flcbb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.003090 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.003849 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-n44jd"] Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.004073 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.014697 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-combined-ca-bundle\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.014789 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2917626f-4943-4e2a-989b-3ec041337596-operator-scripts\") pod \"neutron-db-create-n44jd\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.014828 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7ff\" (UniqueName: \"kubernetes.io/projected/42953bf3-0bd5-467e-9502-6ddea5e8bf92-kube-api-access-bg7ff\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.014853 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfn69\" (UniqueName: \"kubernetes.io/projected/2917626f-4943-4e2a-989b-3ec041337596-kube-api-access-kfn69\") pod \"neutron-db-create-n44jd\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.014903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-operator-scripts\") pod \"barbican-db-create-4nrzc\" 
(UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.014944 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnbn5\" (UniqueName: \"kubernetes.io/projected/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-kube-api-access-bnbn5\") pod \"barbican-db-create-4nrzc\" (UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.014987 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-config-data\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.017980 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-operator-scripts\") pod \"barbican-db-create-4nrzc\" (UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.041349 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnbn5\" (UniqueName: \"kubernetes.io/projected/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-kube-api-access-bnbn5\") pod \"barbican-db-create-4nrzc\" (UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.108632 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-efc2-account-create-update-54dhl"] Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.110126 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.112827 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.116748 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-combined-ca-bundle\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.116824 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2917626f-4943-4e2a-989b-3ec041337596-operator-scripts\") pod \"neutron-db-create-n44jd\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.116865 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7ff\" (UniqueName: \"kubernetes.io/projected/42953bf3-0bd5-467e-9502-6ddea5e8bf92-kube-api-access-bg7ff\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.116891 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfn69\" (UniqueName: \"kubernetes.io/projected/2917626f-4943-4e2a-989b-3ec041337596-kube-api-access-kfn69\") pod \"neutron-db-create-n44jd\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.116964 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-config-data\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.118070 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2917626f-4943-4e2a-989b-3ec041337596-operator-scripts\") pod \"neutron-db-create-n44jd\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.121500 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-combined-ca-bundle\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.122523 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-efc2-account-create-update-54dhl"] Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.125320 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-config-data\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.149798 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfn69\" (UniqueName: \"kubernetes.io/projected/2917626f-4943-4e2a-989b-3ec041337596-kube-api-access-kfn69\") pod \"neutron-db-create-n44jd\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.176476 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-bg7ff\" (UniqueName: \"kubernetes.io/projected/42953bf3-0bd5-467e-9502-6ddea5e8bf92-kube-api-access-bg7ff\") pod \"keystone-db-sync-h89v6\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.201901 4869 generic.go:334] "Generic (PLEG): container finished" podID="349a0f9a-cebb-4465-9774-b497c95357b6" containerID="0c07fd9e49269aa67a9298f7761da7f26974c0466717a12acdcc05514613aa2a" exitCode=0 Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.201955 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mkmpj" event={"ID":"349a0f9a-cebb-4465-9774-b497c95357b6","Type":"ContainerDied","Data":"0c07fd9e49269aa67a9298f7761da7f26974c0466717a12acdcc05514613aa2a"} Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.220100 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7t9r\" (UniqueName: \"kubernetes.io/projected/18c82400-c2cb-44d6-a384-8c604824f269-kube-api-access-s7t9r\") pod \"neutron-efc2-account-create-update-54dhl\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.220192 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18c82400-c2cb-44d6-a384-8c604824f269-operator-scripts\") pod \"neutron-efc2-account-create-update-54dhl\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.234178 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.247533 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.302863 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.322182 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7t9r\" (UniqueName: \"kubernetes.io/projected/18c82400-c2cb-44d6-a384-8c604824f269-kube-api-access-s7t9r\") pod \"neutron-efc2-account-create-update-54dhl\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.322253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18c82400-c2cb-44d6-a384-8c604824f269-operator-scripts\") pod \"neutron-efc2-account-create-update-54dhl\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.323142 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18c82400-c2cb-44d6-a384-8c604824f269-operator-scripts\") pod \"neutron-efc2-account-create-update-54dhl\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.344486 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7t9r\" (UniqueName: \"kubernetes.io/projected/18c82400-c2cb-44d6-a384-8c604824f269-kube-api-access-s7t9r\") pod \"neutron-efc2-account-create-update-54dhl\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.486069 4869 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.495526 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-468h4"] Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.778048 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:07:16 crc kubenswrapper[4869]: W0312 15:07:16.821007 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2552e441_e5f2_4fe6_bd1b_4a92dc9e2cce.slice/crio-4ad9b15eac5450d05a9fd6882e35cb39c903d9269109d0fb742560231c630e62 WatchSource:0}: Error finding container 4ad9b15eac5450d05a9fd6882e35cb39c903d9269109d0fb742560231c630e62: Status 404 returned error can't find the container with id 4ad9b15eac5450d05a9fd6882e35cb39c903d9269109d0fb742560231c630e62 Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.829752 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gfwwb"] Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.833355 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-sb\") pod \"349a0f9a-cebb-4465-9774-b497c95357b6\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.833602 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-dns-svc\") pod \"349a0f9a-cebb-4465-9774-b497c95357b6\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.833733 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-nb\") pod \"349a0f9a-cebb-4465-9774-b497c95357b6\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.833832 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmvjm\" (UniqueName: \"kubernetes.io/projected/349a0f9a-cebb-4465-9774-b497c95357b6-kube-api-access-nmvjm\") pod \"349a0f9a-cebb-4465-9774-b497c95357b6\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.833920 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-config\") pod \"349a0f9a-cebb-4465-9774-b497c95357b6\" (UID: \"349a0f9a-cebb-4465-9774-b497c95357b6\") " Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.848034 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aa40-account-create-update-dj2sh"] Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.849709 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349a0f9a-cebb-4465-9774-b497c95357b6-kube-api-access-nmvjm" (OuterVolumeSpecName: "kube-api-access-nmvjm") pod "349a0f9a-cebb-4465-9774-b497c95357b6" (UID: "349a0f9a-cebb-4465-9774-b497c95357b6"). InnerVolumeSpecName "kube-api-access-nmvjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.928966 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-config" (OuterVolumeSpecName: "config") pod "349a0f9a-cebb-4465-9774-b497c95357b6" (UID: "349a0f9a-cebb-4465-9774-b497c95357b6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.928691 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "349a0f9a-cebb-4465-9774-b497c95357b6" (UID: "349a0f9a-cebb-4465-9774-b497c95357b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.934782 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.934804 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmvjm\" (UniqueName: \"kubernetes.io/projected/349a0f9a-cebb-4465-9774-b497c95357b6-kube-api-access-nmvjm\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.934814 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.979732 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "349a0f9a-cebb-4465-9774-b497c95357b6" (UID: "349a0f9a-cebb-4465-9774-b497c95357b6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:16 crc kubenswrapper[4869]: I0312 15:07:16.991477 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4nrzc"] Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.002496 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "349a0f9a-cebb-4465-9774-b497c95357b6" (UID: "349a0f9a-cebb-4465-9774-b497c95357b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:17 crc kubenswrapper[4869]: W0312 15:07:17.005652 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2917626f_4943_4e2a_989b_3ec041337596.slice/crio-ea199879952392ed783c8dce6e60331d55d3fc847ff5a4f4087148711ffd2de6 WatchSource:0}: Error finding container ea199879952392ed783c8dce6e60331d55d3fc847ff5a4f4087148711ffd2de6: Status 404 returned error can't find the container with id ea199879952392ed783c8dce6e60331d55d3fc847ff5a4f4087148711ffd2de6 Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.006182 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-n44jd"] Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.035881 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.035914 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/349a0f9a-cebb-4465-9774-b497c95357b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.121240 4869 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-efc2-account-create-update-54dhl"] Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.214434 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-318e-account-create-update-zvxwz"] Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.231441 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h89v6"] Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.240009 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aa40-account-create-update-dj2sh" event={"ID":"7b8504b7-ac15-45df-b26a-7c848cf4c068","Type":"ContainerStarted","Data":"9ce1eea7ceafe6fc46678886416f1b58da24b4abe7ed2164b6c1448e1cda05fc"} Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.242963 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4nrzc" event={"ID":"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0","Type":"ContainerStarted","Data":"aebafe8f7ce086f2325b476e12a817eab4fa68d9b5351a1c67d66ef936f8d8c3"} Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.250270 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-6589-account-create-update-flcbb"] Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.254810 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-n44jd" event={"ID":"2917626f-4943-4e2a-989b-3ec041337596","Type":"ContainerStarted","Data":"ea199879952392ed783c8dce6e60331d55d3fc847ff5a4f4087148711ffd2de6"} Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.269269 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mkmpj" event={"ID":"349a0f9a-cebb-4465-9774-b497c95357b6","Type":"ContainerDied","Data":"dbc60ef521237927abf30373117692ea48953dce7451c1ec3bf82c3b6c07fab9"} Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.269320 4869 scope.go:117] "RemoveContainer" 
containerID="0c07fd9e49269aa67a9298f7761da7f26974c0466717a12acdcc05514613aa2a" Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.269441 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mkmpj" Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.276146 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gfwwb" event={"ID":"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce","Type":"ContainerStarted","Data":"4ad9b15eac5450d05a9fd6882e35cb39c903d9269109d0fb742560231c630e62"} Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.284718 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-468h4" event={"ID":"e31b3289-8593-4def-be89-bed7b813ef1b","Type":"ContainerStarted","Data":"dfc729fa701f62a7acbd858e1ce6589b7b5cc7f3e4e0814d46c8b66a576c30bb"} Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.284762 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-468h4" event={"ID":"e31b3289-8593-4def-be89-bed7b813ef1b","Type":"ContainerStarted","Data":"192cb6dd726f7b7f259f98f2a6220d847777edb688361e984e265b65fffbd1e5"} Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.312507 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-468h4" podStartSLOduration=2.312489944 podStartE2EDuration="2.312489944s" podCreationTimestamp="2026-03-12 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:17.310831306 +0000 UTC m=+1189.596056584" watchObservedRunningTime="2026-03-12 15:07:17.312489944 +0000 UTC m=+1189.597715222" Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.332848 4869 scope.go:117] "RemoveContainer" containerID="b719cf0d2219bc98e57cc40a542a4c524ddacd6264fc96a45d45a2f39129b310" Mar 12 15:07:17 crc kubenswrapper[4869]: W0312 
15:07:17.364221 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab13ae17_fd00_4317_bbe7_bac949fbb0bb.slice/crio-421e745f743ae306a3c9e3508841d8139148426bf94573bed9eb1795adf09518 WatchSource:0}: Error finding container 421e745f743ae306a3c9e3508841d8139148426bf94573bed9eb1795adf09518: Status 404 returned error can't find the container with id 421e745f743ae306a3c9e3508841d8139148426bf94573bed9eb1795adf09518 Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.382852 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mkmpj"] Mar 12 15:07:17 crc kubenswrapper[4869]: I0312 15:07:17.405353 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mkmpj"] Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.294108 4869 generic.go:334] "Generic (PLEG): container finished" podID="2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce" containerID="d7e2802fbcddddc202060a2ad07bea3a697bcfc6c526ac74580df40d5d118e85" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.294220 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gfwwb" event={"ID":"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce","Type":"ContainerDied","Data":"d7e2802fbcddddc202060a2ad07bea3a697bcfc6c526ac74580df40d5d118e85"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.299235 4869 generic.go:334] "Generic (PLEG): container finished" podID="e31b3289-8593-4def-be89-bed7b813ef1b" containerID="dfc729fa701f62a7acbd858e1ce6589b7b5cc7f3e4e0814d46c8b66a576c30bb" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.299310 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-468h4" event={"ID":"e31b3289-8593-4def-be89-bed7b813ef1b","Type":"ContainerDied","Data":"dfc729fa701f62a7acbd858e1ce6589b7b5cc7f3e4e0814d46c8b66a576c30bb"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 
15:07:18.300849 4869 generic.go:334] "Generic (PLEG): container finished" podID="7b8504b7-ac15-45df-b26a-7c848cf4c068" containerID="3a2842f8f11d563df3cf17fa015a2d9eaad47c9fd20005ea4dada238ecf6a1b7" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.300904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aa40-account-create-update-dj2sh" event={"ID":"7b8504b7-ac15-45df-b26a-7c848cf4c068","Type":"ContainerDied","Data":"3a2842f8f11d563df3cf17fa015a2d9eaad47c9fd20005ea4dada238ecf6a1b7"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.303226 4869 generic.go:334] "Generic (PLEG): container finished" podID="eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0" containerID="e7d3c51e8ec76a1ad414f7a73293decf839862c1d364222ca0eee41115ec500b" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.303284 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4nrzc" event={"ID":"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0","Type":"ContainerDied","Data":"e7d3c51e8ec76a1ad414f7a73293decf839862c1d364222ca0eee41115ec500b"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.308944 4869 generic.go:334] "Generic (PLEG): container finished" podID="0bcf123a-c619-4d5e-a18a-e9a40db85ce4" containerID="280546406e7403bb6ec74cebcd27dc92a837e19c2dc31048c2e42d77827e7cc0" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.309127 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-318e-account-create-update-zvxwz" event={"ID":"0bcf123a-c619-4d5e-a18a-e9a40db85ce4","Type":"ContainerDied","Data":"280546406e7403bb6ec74cebcd27dc92a837e19c2dc31048c2e42d77827e7cc0"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.309163 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-318e-account-create-update-zvxwz" 
event={"ID":"0bcf123a-c619-4d5e-a18a-e9a40db85ce4","Type":"ContainerStarted","Data":"63a7257c1121dbbce1c3f43ecef1e6d7e064484a6f00e6ecb63a9a773302a4d8"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.315873 4869 generic.go:334] "Generic (PLEG): container finished" podID="ab13ae17-fd00-4317-bbe7-bac949fbb0bb" containerID="85aac3668302de3448bcef5f1aed7619f7feddce3efa61660a003a9bd189257b" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.315983 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6589-account-create-update-flcbb" event={"ID":"ab13ae17-fd00-4317-bbe7-bac949fbb0bb","Type":"ContainerDied","Data":"85aac3668302de3448bcef5f1aed7619f7feddce3efa61660a003a9bd189257b"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.316022 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6589-account-create-update-flcbb" event={"ID":"ab13ae17-fd00-4317-bbe7-bac949fbb0bb","Type":"ContainerStarted","Data":"421e745f743ae306a3c9e3508841d8139148426bf94573bed9eb1795adf09518"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.317516 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h89v6" event={"ID":"42953bf3-0bd5-467e-9502-6ddea5e8bf92","Type":"ContainerStarted","Data":"5df3b048338d1569a73d05b5b999d954138d6d56c8b22f83e7b9151fb2142635"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.319157 4869 generic.go:334] "Generic (PLEG): container finished" podID="18c82400-c2cb-44d6-a384-8c604824f269" containerID="eb0b5225f0b17ea9fd14d44ab11c0e8d9b454f53a8ec345e0a8c5bdf4f90dca2" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.319528 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-efc2-account-create-update-54dhl" event={"ID":"18c82400-c2cb-44d6-a384-8c604824f269","Type":"ContainerDied","Data":"eb0b5225f0b17ea9fd14d44ab11c0e8d9b454f53a8ec345e0a8c5bdf4f90dca2"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 
15:07:18.319571 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-efc2-account-create-update-54dhl" event={"ID":"18c82400-c2cb-44d6-a384-8c604824f269","Type":"ContainerStarted","Data":"32803000b24d46f00f77f60d41cf63c35b69ab67f180544da6763181a4db9025"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.322706 4869 generic.go:334] "Generic (PLEG): container finished" podID="2917626f-4943-4e2a-989b-3ec041337596" containerID="f9a0313a123637debb355a3e3a1a19d19df8997691674ff56da3a3483f37dbeb" exitCode=0 Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.322817 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-n44jd" event={"ID":"2917626f-4943-4e2a-989b-3ec041337596","Type":"ContainerDied","Data":"f9a0313a123637debb355a3e3a1a19d19df8997691674ff56da3a3483f37dbeb"} Mar 12 15:07:18 crc kubenswrapper[4869]: I0312 15:07:18.351504 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349a0f9a-cebb-4465-9774-b497c95357b6" path="/var/lib/kubelet/pods/349a0f9a-cebb-4465-9774-b497c95357b6/volumes" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.626593 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.635923 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-468h4" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.665656 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.693105 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.695991 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.709816 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.719561 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.731975 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.822965 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw26j\" (UniqueName: \"kubernetes.io/projected/7b8504b7-ac15-45df-b26a-7c848cf4c068-kube-api-access-rw26j\") pod \"7b8504b7-ac15-45df-b26a-7c848cf4c068\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823002 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76kl5\" (UniqueName: \"kubernetes.io/projected/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-kube-api-access-76kl5\") pod \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823063 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-operator-scripts\") pod \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823114 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfn69\" (UniqueName: 
\"kubernetes.io/projected/2917626f-4943-4e2a-989b-3ec041337596-kube-api-access-kfn69\") pod \"2917626f-4943-4e2a-989b-3ec041337596\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823146 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnp9d\" (UniqueName: \"kubernetes.io/projected/e31b3289-8593-4def-be89-bed7b813ef1b-kube-api-access-hnp9d\") pod \"e31b3289-8593-4def-be89-bed7b813ef1b\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823173 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e31b3289-8593-4def-be89-bed7b813ef1b-operator-scripts\") pod \"e31b3289-8593-4def-be89-bed7b813ef1b\" (UID: \"e31b3289-8593-4def-be89-bed7b813ef1b\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823202 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-operator-scripts\") pod \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823233 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-operator-scripts\") pod \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\" (UID: \"0bcf123a-c619-4d5e-a18a-e9a40db85ce4\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823261 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-operator-scripts\") pod \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\" (UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " Mar 12 15:07:21 
crc kubenswrapper[4869]: I0312 15:07:21.823290 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2917626f-4943-4e2a-989b-3ec041337596-operator-scripts\") pod \"2917626f-4943-4e2a-989b-3ec041337596\" (UID: \"2917626f-4943-4e2a-989b-3ec041337596\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823319 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8504b7-ac15-45df-b26a-7c848cf4c068-operator-scripts\") pod \"7b8504b7-ac15-45df-b26a-7c848cf4c068\" (UID: \"7b8504b7-ac15-45df-b26a-7c848cf4c068\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823349 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vlff\" (UniqueName: \"kubernetes.io/projected/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-kube-api-access-4vlff\") pod \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\" (UID: \"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823371 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnbn5\" (UniqueName: \"kubernetes.io/projected/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-kube-api-access-bnbn5\") pod \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\" (UID: \"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.823409 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kpp2\" (UniqueName: \"kubernetes.io/projected/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-kube-api-access-6kpp2\") pod \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\" (UID: \"ab13ae17-fd00-4317-bbe7-bac949fbb0bb\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.824155 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce" (UID: "2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.824330 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bcf123a-c619-4d5e-a18a-e9a40db85ce4" (UID: "0bcf123a-c619-4d5e-a18a-e9a40db85ce4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.825018 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e31b3289-8593-4def-be89-bed7b813ef1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e31b3289-8593-4def-be89-bed7b813ef1b" (UID: "e31b3289-8593-4def-be89-bed7b813ef1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.825106 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab13ae17-fd00-4317-bbe7-bac949fbb0bb" (UID: "ab13ae17-fd00-4317-bbe7-bac949fbb0bb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.825317 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b8504b7-ac15-45df-b26a-7c848cf4c068-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b8504b7-ac15-45df-b26a-7c848cf4c068" (UID: "7b8504b7-ac15-45df-b26a-7c848cf4c068"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.825921 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2917626f-4943-4e2a-989b-3ec041337596-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2917626f-4943-4e2a-989b-3ec041337596" (UID: "2917626f-4943-4e2a-989b-3ec041337596"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.826189 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.826206 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e31b3289-8593-4def-be89-bed7b813ef1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.826214 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.826222 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.826230 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2917626f-4943-4e2a-989b-3ec041337596-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.826228 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0" (UID: "eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.826238 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b8504b7-ac15-45df-b26a-7c848cf4c068-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.829335 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-kube-api-access-6kpp2" (OuterVolumeSpecName: "kube-api-access-6kpp2") pod "ab13ae17-fd00-4317-bbe7-bac949fbb0bb" (UID: "ab13ae17-fd00-4317-bbe7-bac949fbb0bb"). InnerVolumeSpecName "kube-api-access-6kpp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.829472 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8504b7-ac15-45df-b26a-7c848cf4c068-kube-api-access-rw26j" (OuterVolumeSpecName: "kube-api-access-rw26j") pod "7b8504b7-ac15-45df-b26a-7c848cf4c068" (UID: "7b8504b7-ac15-45df-b26a-7c848cf4c068"). InnerVolumeSpecName "kube-api-access-rw26j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.829740 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31b3289-8593-4def-be89-bed7b813ef1b-kube-api-access-hnp9d" (OuterVolumeSpecName: "kube-api-access-hnp9d") pod "e31b3289-8593-4def-be89-bed7b813ef1b" (UID: "e31b3289-8593-4def-be89-bed7b813ef1b"). InnerVolumeSpecName "kube-api-access-hnp9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.830361 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2917626f-4943-4e2a-989b-3ec041337596-kube-api-access-kfn69" (OuterVolumeSpecName: "kube-api-access-kfn69") pod "2917626f-4943-4e2a-989b-3ec041337596" (UID: "2917626f-4943-4e2a-989b-3ec041337596"). InnerVolumeSpecName "kube-api-access-kfn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.831094 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-kube-api-access-4vlff" (OuterVolumeSpecName: "kube-api-access-4vlff") pod "2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce" (UID: "2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce"). InnerVolumeSpecName "kube-api-access-4vlff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.835753 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-kube-api-access-76kl5" (OuterVolumeSpecName: "kube-api-access-76kl5") pod "0bcf123a-c619-4d5e-a18a-e9a40db85ce4" (UID: "0bcf123a-c619-4d5e-a18a-e9a40db85ce4"). InnerVolumeSpecName "kube-api-access-76kl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.836829 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-kube-api-access-bnbn5" (OuterVolumeSpecName: "kube-api-access-bnbn5") pod "eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0" (UID: "eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0"). InnerVolumeSpecName "kube-api-access-bnbn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.927340 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18c82400-c2cb-44d6-a384-8c604824f269-operator-scripts\") pod \"18c82400-c2cb-44d6-a384-8c604824f269\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.927638 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7t9r\" (UniqueName: \"kubernetes.io/projected/18c82400-c2cb-44d6-a384-8c604824f269-kube-api-access-s7t9r\") pod \"18c82400-c2cb-44d6-a384-8c604824f269\" (UID: \"18c82400-c2cb-44d6-a384-8c604824f269\") " Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928005 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kpp2\" (UniqueName: \"kubernetes.io/projected/ab13ae17-fd00-4317-bbe7-bac949fbb0bb-kube-api-access-6kpp2\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928027 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw26j\" (UniqueName: \"kubernetes.io/projected/7b8504b7-ac15-45df-b26a-7c848cf4c068-kube-api-access-rw26j\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928039 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76kl5\" (UniqueName: 
\"kubernetes.io/projected/0bcf123a-c619-4d5e-a18a-e9a40db85ce4-kube-api-access-76kl5\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928051 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfn69\" (UniqueName: \"kubernetes.io/projected/2917626f-4943-4e2a-989b-3ec041337596-kube-api-access-kfn69\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928062 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnp9d\" (UniqueName: \"kubernetes.io/projected/e31b3289-8593-4def-be89-bed7b813ef1b-kube-api-access-hnp9d\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928074 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928086 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vlff\" (UniqueName: \"kubernetes.io/projected/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce-kube-api-access-4vlff\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.928099 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnbn5\" (UniqueName: \"kubernetes.io/projected/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0-kube-api-access-bnbn5\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.929136 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c82400-c2cb-44d6-a384-8c604824f269-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18c82400-c2cb-44d6-a384-8c604824f269" (UID: "18c82400-c2cb-44d6-a384-8c604824f269"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:21 crc kubenswrapper[4869]: I0312 15:07:21.931626 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c82400-c2cb-44d6-a384-8c604824f269-kube-api-access-s7t9r" (OuterVolumeSpecName: "kube-api-access-s7t9r") pod "18c82400-c2cb-44d6-a384-8c604824f269" (UID: "18c82400-c2cb-44d6-a384-8c604824f269"). InnerVolumeSpecName "kube-api-access-s7t9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.029447 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7t9r\" (UniqueName: \"kubernetes.io/projected/18c82400-c2cb-44d6-a384-8c604824f269-kube-api-access-s7t9r\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.029737 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18c82400-c2cb-44d6-a384-8c604824f269-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.382426 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gfwwb" event={"ID":"2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce","Type":"ContainerDied","Data":"4ad9b15eac5450d05a9fd6882e35cb39c903d9269109d0fb742560231c630e62"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.382467 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gfwwb" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.382479 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad9b15eac5450d05a9fd6882e35cb39c903d9269109d0fb742560231c630e62" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.384867 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-468h4" event={"ID":"e31b3289-8593-4def-be89-bed7b813ef1b","Type":"ContainerDied","Data":"192cb6dd726f7b7f259f98f2a6220d847777edb688361e984e265b65fffbd1e5"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.384912 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="192cb6dd726f7b7f259f98f2a6220d847777edb688361e984e265b65fffbd1e5" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.385416 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-468h4" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.387501 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h89v6" event={"ID":"42953bf3-0bd5-467e-9502-6ddea5e8bf92","Type":"ContainerStarted","Data":"d6f56cce82a1c2c58f55b9d26a6c8afe3d7b1528ce0cde80535c453bab65d721"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.389842 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-efc2-account-create-update-54dhl" event={"ID":"18c82400-c2cb-44d6-a384-8c604824f269","Type":"ContainerDied","Data":"32803000b24d46f00f77f60d41cf63c35b69ab67f180544da6763181a4db9025"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.389882 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32803000b24d46f00f77f60d41cf63c35b69ab67f180544da6763181a4db9025" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.389937 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-efc2-account-create-update-54dhl" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.393458 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4nrzc" event={"ID":"eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0","Type":"ContainerDied","Data":"aebafe8f7ce086f2325b476e12a817eab4fa68d9b5351a1c67d66ef936f8d8c3"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.393502 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aebafe8f7ce086f2325b476e12a817eab4fa68d9b5351a1c67d66ef936f8d8c3" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.393596 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4nrzc" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.397631 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-6589-account-create-update-flcbb" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.397848 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-6589-account-create-update-flcbb" event={"ID":"ab13ae17-fd00-4317-bbe7-bac949fbb0bb","Type":"ContainerDied","Data":"421e745f743ae306a3c9e3508841d8139148426bf94573bed9eb1795adf09518"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.397892 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="421e745f743ae306a3c9e3508841d8139148426bf94573bed9eb1795adf09518" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.399527 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aa40-account-create-update-dj2sh" event={"ID":"7b8504b7-ac15-45df-b26a-7c848cf4c068","Type":"ContainerDied","Data":"9ce1eea7ceafe6fc46678886416f1b58da24b4abe7ed2164b6c1448e1cda05fc"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.399580 4869 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9ce1eea7ceafe6fc46678886416f1b58da24b4abe7ed2164b6c1448e1cda05fc" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.399581 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aa40-account-create-update-dj2sh" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.403016 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-n44jd" event={"ID":"2917626f-4943-4e2a-989b-3ec041337596","Type":"ContainerDied","Data":"ea199879952392ed783c8dce6e60331d55d3fc847ff5a4f4087148711ffd2de6"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.403047 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea199879952392ed783c8dce6e60331d55d3fc847ff5a4f4087148711ffd2de6" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.403098 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-n44jd" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.408817 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-h89v6" podStartSLOduration=3.299468764 podStartE2EDuration="7.40879925s" podCreationTimestamp="2026-03-12 15:07:15 +0000 UTC" firstStartedPulling="2026-03-12 15:07:17.403406759 +0000 UTC m=+1189.688632037" lastFinishedPulling="2026-03-12 15:07:21.512737245 +0000 UTC m=+1193.797962523" observedRunningTime="2026-03-12 15:07:22.403110247 +0000 UTC m=+1194.688335525" watchObservedRunningTime="2026-03-12 15:07:22.40879925 +0000 UTC m=+1194.694024518" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.408848 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-318e-account-create-update-zvxwz" event={"ID":"0bcf123a-c619-4d5e-a18a-e9a40db85ce4","Type":"ContainerDied","Data":"63a7257c1121dbbce1c3f43ecef1e6d7e064484a6f00e6ecb63a9a773302a4d8"} Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.408895 
4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a7257c1121dbbce1c3f43ecef1e6d7e064484a6f00e6ecb63a9a773302a4d8" Mar 12 15:07:22 crc kubenswrapper[4869]: I0312 15:07:22.408991 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-318e-account-create-update-zvxwz" Mar 12 15:07:24 crc kubenswrapper[4869]: I0312 15:07:24.428280 4869 generic.go:334] "Generic (PLEG): container finished" podID="42953bf3-0bd5-467e-9502-6ddea5e8bf92" containerID="d6f56cce82a1c2c58f55b9d26a6c8afe3d7b1528ce0cde80535c453bab65d721" exitCode=0 Mar 12 15:07:24 crc kubenswrapper[4869]: I0312 15:07:24.428363 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h89v6" event={"ID":"42953bf3-0bd5-467e-9502-6ddea5e8bf92","Type":"ContainerDied","Data":"d6f56cce82a1c2c58f55b9d26a6c8afe3d7b1528ce0cde80535c453bab65d721"} Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.780421 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.795161 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7ff\" (UniqueName: \"kubernetes.io/projected/42953bf3-0bd5-467e-9502-6ddea5e8bf92-kube-api-access-bg7ff\") pod \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.795595 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-config-data\") pod \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.795718 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-combined-ca-bundle\") pod \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\" (UID: \"42953bf3-0bd5-467e-9502-6ddea5e8bf92\") " Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.802844 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42953bf3-0bd5-467e-9502-6ddea5e8bf92-kube-api-access-bg7ff" (OuterVolumeSpecName: "kube-api-access-bg7ff") pod "42953bf3-0bd5-467e-9502-6ddea5e8bf92" (UID: "42953bf3-0bd5-467e-9502-6ddea5e8bf92"). InnerVolumeSpecName "kube-api-access-bg7ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.821049 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42953bf3-0bd5-467e-9502-6ddea5e8bf92" (UID: "42953bf3-0bd5-467e-9502-6ddea5e8bf92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.854855 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-config-data" (OuterVolumeSpecName: "config-data") pod "42953bf3-0bd5-467e-9502-6ddea5e8bf92" (UID: "42953bf3-0bd5-467e-9502-6ddea5e8bf92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.897673 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg7ff\" (UniqueName: \"kubernetes.io/projected/42953bf3-0bd5-467e-9502-6ddea5e8bf92-kube-api-access-bg7ff\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.897707 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:25 crc kubenswrapper[4869]: I0312 15:07:25.897717 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42953bf3-0bd5-467e-9502-6ddea5e8bf92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.447548 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h89v6" event={"ID":"42953bf3-0bd5-467e-9502-6ddea5e8bf92","Type":"ContainerDied","Data":"5df3b048338d1569a73d05b5b999d954138d6d56c8b22f83e7b9151fb2142635"} Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.447927 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5df3b048338d1569a73d05b5b999d954138d6d56c8b22f83e7b9151fb2142635" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.447634 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h89v6" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656090 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ww7mn"] Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656643 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349a0f9a-cebb-4465-9774-b497c95357b6" containerName="init" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656662 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="349a0f9a-cebb-4465-9774-b497c95357b6" containerName="init" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656676 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656683 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656695 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42953bf3-0bd5-467e-9502-6ddea5e8bf92" containerName="keystone-db-sync" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656701 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="42953bf3-0bd5-467e-9502-6ddea5e8bf92" containerName="keystone-db-sync" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656711 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656717 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656730 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8504b7-ac15-45df-b26a-7c848cf4c068" 
containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656736 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8504b7-ac15-45df-b26a-7c848cf4c068" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656745 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31b3289-8593-4def-be89-bed7b813ef1b" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656751 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31b3289-8593-4def-be89-bed7b813ef1b" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656762 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bcf123a-c619-4d5e-a18a-e9a40db85ce4" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656768 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcf123a-c619-4d5e-a18a-e9a40db85ce4" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656781 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2917626f-4943-4e2a-989b-3ec041337596" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656787 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2917626f-4943-4e2a-989b-3ec041337596" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656793 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349a0f9a-cebb-4465-9774-b497c95357b6" containerName="dnsmasq-dns" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656799 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="349a0f9a-cebb-4465-9774-b497c95357b6" containerName="dnsmasq-dns" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656807 4869 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="18c82400-c2cb-44d6-a384-8c604824f269" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656813 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c82400-c2cb-44d6-a384-8c604824f269" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: E0312 15:07:26.656828 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab13ae17-fd00-4317-bbe7-bac949fbb0bb" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656834 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab13ae17-fd00-4317-bbe7-bac949fbb0bb" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656977 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="349a0f9a-cebb-4465-9774-b497c95357b6" containerName="dnsmasq-dns" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656988 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8504b7-ac15-45df-b26a-7c848cf4c068" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.656997 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2917626f-4943-4e2a-989b-3ec041337596" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.657007 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bcf123a-c619-4d5e-a18a-e9a40db85ce4" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.657017 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab13ae17-fd00-4317-bbe7-bac949fbb0bb" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.657023 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="42953bf3-0bd5-467e-9502-6ddea5e8bf92" 
containerName="keystone-db-sync" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.657037 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31b3289-8593-4def-be89-bed7b813ef1b" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.657046 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.657054 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0" containerName="mariadb-database-create" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.657066 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c82400-c2cb-44d6-a384-8c604824f269" containerName="mariadb-account-create-update" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.658000 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.677286 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ww7mn"] Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.710440 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.710485 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-config\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " 
pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.710504 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.710527 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.710580 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.710632 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mgc\" (UniqueName: \"kubernetes.io/projected/268b9009-c0b5-4ea9-a199-797e2444130c-kube-api-access-r7mgc\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.716116 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r56x2"] Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.731126 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.739241 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.739357 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cl6hz" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.739651 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.742131 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.748850 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.754865 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r56x2"] Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818368 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818409 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-config\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818437 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818474 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6pcb\" (UniqueName: \"kubernetes.io/projected/8d771411-1e30-4680-913f-a9e0a9a74e31-kube-api-access-m6pcb\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818507 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-combined-ca-bundle\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818527 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-credential-keys\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818561 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-fernet-keys\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818588 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.818618 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-scripts\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.819797 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-config-data\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.820939 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mgc\" (UniqueName: \"kubernetes.io/projected/268b9009-c0b5-4ea9-a199-797e2444130c-kube-api-access-r7mgc\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.821093 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.819762 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.821981 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.823304 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.823528 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-config\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.830968 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.886409 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mgc\" (UniqueName: \"kubernetes.io/projected/268b9009-c0b5-4ea9-a199-797e2444130c-kube-api-access-r7mgc\") pod 
\"dnsmasq-dns-847c4cc679-ww7mn\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.902286 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cb86d8b85-vl7p7"] Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.903792 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.908148 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cb86d8b85-vl7p7"] Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.920793 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-t7fgp"] Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922490 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6pcb\" (UniqueName: \"kubernetes.io/projected/8d771411-1e30-4680-913f-a9e0a9a74e31-kube-api-access-m6pcb\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922521 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e6c8a7-be39-4624-b798-c2f6715ab6e8-logs\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922555 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr48s\" (UniqueName: \"kubernetes.io/projected/18e6c8a7-be39-4624-b798-c2f6715ab6e8-kube-api-access-dr48s\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:26 crc 
kubenswrapper[4869]: I0312 15:07:26.922577 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-config-data\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922597 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-combined-ca-bundle\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922612 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-credential-keys\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922629 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-fernet-keys\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922650 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-scripts\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922668 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-scripts\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922687 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-config-data\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.922777 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e6c8a7-be39-4624-b798-c2f6715ab6e8-horizon-secret-key\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.923860 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.928738 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.929066 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.929240 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.930078 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-phhph" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.933684 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-credential-keys\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.934319 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-combined-ca-bundle\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.934580 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7fgp"] Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.944902 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.945204 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9c8hn" Mar 12 15:07:26 crc 
kubenswrapper[4869]: I0312 15:07:26.945345 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.960490 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-config-data\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.970801 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-scripts\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.974251 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-fernet-keys\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:26 crc kubenswrapper[4869]: I0312 15:07:26.984040 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6pcb\" (UniqueName: \"kubernetes.io/projected/8d771411-1e30-4680-913f-a9e0a9a74e31-kube-api-access-m6pcb\") pod \"keystone-bootstrap-r56x2\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.002763 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.024498 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e6c8a7-be39-4624-b798-c2f6715ab6e8-horizon-secret-key\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.024568 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e6c8a7-be39-4624-b798-c2f6715ab6e8-logs\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.024589 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr48s\" (UniqueName: \"kubernetes.io/projected/18e6c8a7-be39-4624-b798-c2f6715ab6e8-kube-api-access-dr48s\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.024606 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-config-data\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.024638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-scripts\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: 
I0312 15:07:27.025456 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-scripts\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.028172 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e6c8a7-be39-4624-b798-c2f6715ab6e8-horizon-secret-key\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.028402 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e6c8a7-be39-4624-b798-c2f6715ab6e8-logs\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.029658 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-config-data\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.067599 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-9p9gw"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.068848 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.074902 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.075163 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-kftdm" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.081359 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-vxdx8"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.082629 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.095192 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.095371 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k4mns" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.095786 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.096164 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.115607 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vxdx8"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.121045 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr48s\" (UniqueName: \"kubernetes.io/projected/18e6c8a7-be39-4624-b798-c2f6715ab6e8-kube-api-access-dr48s\") pod \"horizon-5cb86d8b85-vl7p7\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.140318 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.145480 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-db-sync-config-data\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.145567 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-config-data\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.145605 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s57vj\" (UniqueName: \"kubernetes.io/projected/ca1953bb-fc5d-4285-8de4-b67746201d05-kube-api-access-s57vj\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc 
kubenswrapper[4869]: I0312 15:07:27.145661 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca1953bb-fc5d-4285-8de4-b67746201d05-etc-machine-id\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.145739 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-combined-ca-bundle\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.145777 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-scripts\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.149805 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-9p9gw"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.163598 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ww7mn"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.195813 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2pbrx"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.197129 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.223030 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5855b8d555-mwzss"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.227016 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.236966 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.243426 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.246392 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.246623 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.248866 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-combined-ca-bundle\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249184 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-scripts\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249266 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-log-httpd\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249412 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-combined-ca-bundle\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249507 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-horizon-secret-key\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249595 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-config-data\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249673 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249742 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249864 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xfc\" (UniqueName: \"kubernetes.io/projected/e724f55d-2897-4b64-8a80-cdee522a7143-kube-api-access-k6xfc\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.249935 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250003 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-scripts\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250093 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-scripts\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250168 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-db-sync-config-data\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250239 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-run-httpd\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250611 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-job-config-data\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250714 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqqv\" (UniqueName: \"kubernetes.io/projected/b444668b-c6e2-42fc-93e2-8b14ef77eef3-kube-api-access-2mqqv\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250774 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-config-data\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250822 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250847 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-config-data\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.250919 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s57vj\" (UniqueName: \"kubernetes.io/projected/ca1953bb-fc5d-4285-8de4-b67746201d05-kube-api-access-s57vj\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.252189 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwm5b\" (UniqueName: \"kubernetes.io/projected/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-kube-api-access-rwm5b\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.252281 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-combined-ca-bundle\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.252372 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cadca9f-20ef-432c-8816-e5fea0d0c93e-logs\") pod \"placement-db-sync-vxdx8\" (UID: 
\"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.252440 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-config-data\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258035 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258075 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-scripts\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258097 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-logs\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258120 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-config\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258140 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-config-data\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258175 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca1953bb-fc5d-4285-8de4-b67746201d05-etc-machine-id\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258209 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgmn\" (UniqueName: \"kubernetes.io/projected/2cadca9f-20ef-432c-8816-e5fea0d0c93e-kube-api-access-5zgmn\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258226 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258261 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r49ql\" (UniqueName: \"kubernetes.io/projected/e8721cab-3eb8-4c80-a0c8-79c7e007b614-kube-api-access-r49ql\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " 
pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.255086 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-556xl"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.258504 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca1953bb-fc5d-4285-8de4-b67746201d05-etc-machine-id\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.255755 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-combined-ca-bundle\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.259806 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-config-data\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.264370 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-scripts\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.265281 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.267804 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dhzrk" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.267824 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-db-sync-config-data\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.268045 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.272941 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s57vj\" (UniqueName: \"kubernetes.io/projected/ca1953bb-fc5d-4285-8de4-b67746201d05-kube-api-access-s57vj\") pod \"cinder-db-sync-t7fgp\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.287860 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.291007 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5855b8d555-mwzss"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.364403 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgmn\" (UniqueName: \"kubernetes.io/projected/2cadca9f-20ef-432c-8816-e5fea0d0c93e-kube-api-access-5zgmn\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.364445 4869 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.364471 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8lf\" (UniqueName: \"kubernetes.io/projected/50098315-1895-433b-9d66-df198c579b4e-kube-api-access-hx8lf\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.364516 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r49ql\" (UniqueName: \"kubernetes.io/projected/e8721cab-3eb8-4c80-a0c8-79c7e007b614-kube-api-access-r49ql\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.364596 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-log-httpd\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.366155 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.366674 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-combined-ca-bundle\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.366799 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-horizon-secret-key\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.366854 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-config-data\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.366935 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.366968 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367003 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xfc\" (UniqueName: \"kubernetes.io/projected/e724f55d-2897-4b64-8a80-cdee522a7143-kube-api-access-k6xfc\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" 
(UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367062 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367111 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-scripts\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367184 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-config\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367226 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-scripts\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367254 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-run-httpd\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367278 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-job-config-data\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367314 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mqqv\" (UniqueName: \"kubernetes.io/projected/b444668b-c6e2-42fc-93e2-8b14ef77eef3-kube-api-access-2mqqv\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367347 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367372 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-config-data\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367402 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-combined-ca-bundle\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367430 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwm5b\" (UniqueName: 
\"kubernetes.io/projected/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-kube-api-access-rwm5b\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367456 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-combined-ca-bundle\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367491 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-config-data\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367510 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cadca9f-20ef-432c-8816-e5fea0d0c93e-logs\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367607 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367637 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-scripts\") pod \"horizon-5855b8d555-mwzss\" (UID: 
\"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367653 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-logs\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367668 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-config\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367685 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-config-data\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.367849 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.368973 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-logs\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.369529 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-config\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.372234 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cadca9f-20ef-432c-8816-e5fea0d0c93e-logs\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.372561 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-run-httpd\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.375563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-config-data\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.376412 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.377004 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-sb\") pod 
\"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.379682 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-scripts\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.382234 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-horizon-secret-key\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.382501 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-log-httpd\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.384230 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.388169 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-config-data\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.388658 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-config-data\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.389327 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-scripts\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.392943 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2pbrx"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.397169 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-scripts\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.399485 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.401857 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgmn\" (UniqueName: \"kubernetes.io/projected/2cadca9f-20ef-432c-8816-e5fea0d0c93e-kube-api-access-5zgmn\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.419944 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-combined-ca-bundle\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.420252 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-job-config-data\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.420428 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-combined-ca-bundle\") pod \"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.421051 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-config-data\") pod \"placement-db-sync-vxdx8\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.448719 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwm5b\" (UniqueName: \"kubernetes.io/projected/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-kube-api-access-rwm5b\") pod \"horizon-5855b8d555-mwzss\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.452339 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r49ql\" (UniqueName: \"kubernetes.io/projected/e8721cab-3eb8-4c80-a0c8-79c7e007b614-kube-api-access-r49ql\") pod 
\"manila-db-sync-9p9gw\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.467989 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.469090 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-config\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.469134 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-combined-ca-bundle\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.469170 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8lf\" (UniqueName: \"kubernetes.io/projected/50098315-1895-433b-9d66-df198c579b4e-kube-api-access-hx8lf\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.470730 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xfc\" (UniqueName: \"kubernetes.io/projected/e724f55d-2897-4b64-8a80-cdee522a7143-kube-api-access-k6xfc\") pod \"dnsmasq-dns-785d8bcb8c-2pbrx\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.472622 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:07:27 crc kubenswrapper[4869]: 
I0312 15:07:27.477561 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-config\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.477878 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-combined-ca-bundle\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.479942 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mqqv\" (UniqueName: \"kubernetes.io/projected/b444668b-c6e2-42fc-93e2-8b14ef77eef3-kube-api-access-2mqqv\") pod \"ceilometer-0\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.493776 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8lf\" (UniqueName: \"kubernetes.io/projected/50098315-1895-433b-9d66-df198c579b4e-kube-api-access-hx8lf\") pod \"neutron-db-sync-556xl\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.515106 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-9p9gw" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.537150 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.550967 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.553634 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.556376 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-556xl"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.563821 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.593710 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.602859 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-4rkhc"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.604132 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.607712 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vkmmd" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.609182 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.609476 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.612893 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.617869 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.617999 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4pjv" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.618251 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.618572 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.619987 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4rkhc"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.628824 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.671827 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.674095 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2rtz\" (UniqueName: \"kubernetes.io/projected/bb0e86fd-502f-4fef-9f29-4d612a8d111f-kube-api-access-v2rtz\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.674300 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-db-sync-config-data\") pod \"barbican-db-sync-4rkhc\" (UID: 
\"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.674387 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-combined-ca-bundle\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.782754 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.782916 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-config-data\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.782942 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-logs\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.782964 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.782988 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-db-sync-config-data\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.783011 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-ceph\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.783034 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnv6\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-kube-api-access-wgnv6\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.783052 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-combined-ca-bundle\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.783147 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.783228 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.783266 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-scripts\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.783390 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2rtz\" (UniqueName: \"kubernetes.io/projected/bb0e86fd-502f-4fef-9f29-4d612a8d111f-kube-api-access-v2rtz\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.798257 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-db-sync-config-data\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.798448 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-combined-ca-bundle\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " 
pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.814812 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2rtz\" (UniqueName: \"kubernetes.io/projected/bb0e86fd-502f-4fef-9f29-4d612a8d111f-kube-api-access-v2rtz\") pod \"barbican-db-sync-4rkhc\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.859637 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r56x2"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886476 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnv6\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-kube-api-access-wgnv6\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886532 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886656 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886705 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886792 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886817 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-config-data\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886833 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-logs\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886876 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.886903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-ceph\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " 
pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.890502 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-logs\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.890822 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.892213 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.912957 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-ceph\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.913240 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-scripts\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.914320 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.915111 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-config-data\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.915228 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.925109 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgnv6\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-kube-api-access-wgnv6\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.957070 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.957390 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.965411 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.967454 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.970215 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.970249 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 12 15:07:27 crc kubenswrapper[4869]: I0312 15:07:27.979168 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.023585 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cb86d8b85-vl7p7"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093032 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-logs\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093093 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093122 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093140 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093156 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093177 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093205 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-ceph\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093220 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.093316 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7zxk\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-kube-api-access-n7zxk\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.183509 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7fgp"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.194884 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ww7mn"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.195379 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-logs\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.195429 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.195517 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.196067 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.196249 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-logs\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.196837 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.196872 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.196895 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.196928 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-ceph\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.196944 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.197167 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7zxk\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-kube-api-access-n7zxk\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.197973 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.201917 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.207608 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.208618 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.221986 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-ceph\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.222317 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.240831 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7zxk\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-kube-api-access-n7zxk\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " 
pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.247665 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.263321 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.498338 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.513296 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" event={"ID":"268b9009-c0b5-4ea9-a199-797e2444130c","Type":"ContainerStarted","Data":"857077ba39ecca4d9438182d9d2233c07ccd3b5994633bedba0dd3b66381388e"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.521459 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" event={"ID":"e724f55d-2897-4b64-8a80-cdee522a7143","Type":"ContainerStarted","Data":"e49f1770b143a9135a560b637f31aaac5c06affa416c85df779f0c67bc032070"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.529210 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7fgp" event={"ID":"ca1953bb-fc5d-4285-8de4-b67746201d05","Type":"ContainerStarted","Data":"b21d3ca01c0107f42f8287a0c0d733e88fcf2e0da3f1602a7f3ac0c90f260b58"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.534319 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerStarted","Data":"788a5b2fafa6aa775313d1284d6a52f23ee3f8dbac0ea70c73d3d1f9827438e5"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.541458 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vxdx8" event={"ID":"2cadca9f-20ef-432c-8816-e5fea0d0c93e","Type":"ContainerStarted","Data":"62cc4671254e525dedec9a6f9463ca964477738887c8e971a10fa649f947755a"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.546043 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r56x2" event={"ID":"8d771411-1e30-4680-913f-a9e0a9a74e31","Type":"ContainerStarted","Data":"85f0ae6aebb26417371e54af05957b6d246bf047dc19680a33d5123e212433be"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.546094 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r56x2" event={"ID":"8d771411-1e30-4680-913f-a9e0a9a74e31","Type":"ContainerStarted","Data":"05e36edbb8fed3a56c5437c1e5180732ff88885e322dddb1ebfa29c65b641e5b"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.549882 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb86d8b85-vl7p7" event={"ID":"18e6c8a7-be39-4624-b798-c2f6715ab6e8","Type":"ContainerStarted","Data":"f151fd9be1879756534d902ecc6d4dfe097c1dfb06c19f5d3b72c00b981e0121"} Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.555222 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2pbrx"] Mar 12 15:07:28 crc kubenswrapper[4869]: W0312 15:07:28.570115 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8721cab_3eb8_4c80_a0c8_79c7e007b614.slice/crio-cac60e0160fd577099f2d2db83bdb6f515cc375d92843fe6f69a71109923a491 WatchSource:0}: Error finding container cac60e0160fd577099f2d2db83bdb6f515cc375d92843fe6f69a71109923a491: Status 404 returned 
error can't find the container with id cac60e0160fd577099f2d2db83bdb6f515cc375d92843fe6f69a71109923a491 Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.599361 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.619476 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vxdx8"] Mar 12 15:07:28 crc kubenswrapper[4869]: W0312 15:07:28.647747 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7b2cdfe_e6e9_47b3_99de_d14e20f3330b.slice/crio-ba02d63c3419366c138705d1e79cc1b195805a55e665fb84dc33c21cc0ee62c6 WatchSource:0}: Error finding container ba02d63c3419366c138705d1e79cc1b195805a55e665fb84dc33c21cc0ee62c6: Status 404 returned error can't find the container with id ba02d63c3419366c138705d1e79cc1b195805a55e665fb84dc33c21cc0ee62c6 Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.682718 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-9p9gw"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.686159 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r56x2" podStartSLOduration=2.6861429489999997 podStartE2EDuration="2.686142949s" podCreationTimestamp="2026-03-12 15:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:28.565795831 +0000 UTC m=+1200.851021119" watchObservedRunningTime="2026-03-12 15:07:28.686142949 +0000 UTC m=+1200.971368227" Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.719605 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5855b8d555-mwzss"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.736869 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-556xl"] 
Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.788410 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4rkhc"] Mar 12 15:07:28 crc kubenswrapper[4869]: I0312 15:07:28.958188 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:28 crc kubenswrapper[4869]: W0312 15:07:28.980534 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaedadb74_4f2b_40b3_9847_20519eea4209.slice/crio-40f4ed5eedd6b23c3234b849cb149b1e196d3e0abacd82fea591a97dd216c6c8 WatchSource:0}: Error finding container 40f4ed5eedd6b23c3234b849cb149b1e196d3e0abacd82fea591a97dd216c6c8: Status 404 returned error can't find the container with id 40f4ed5eedd6b23c3234b849cb149b1e196d3e0abacd82fea591a97dd216c6c8 Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.059144 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.110286 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5855b8d555-mwzss"] Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.144419 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.155225 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7df4569cb7-bfnpz"] Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.159379 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.193691 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7df4569cb7-bfnpz"] Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.227000 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ea3aeb8-779a-45a1-b601-0e6f2776311a-logs\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.227083 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ea3aeb8-779a-45a1-b601-0e6f2776311a-horizon-secret-key\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.227110 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-config-data\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.227187 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-scripts\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.227240 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9hb\" (UniqueName: 
\"kubernetes.io/projected/5ea3aeb8-779a-45a1-b601-0e6f2776311a-kube-api-access-ph9hb\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.243940 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.254273 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:29 crc kubenswrapper[4869]: W0312 15:07:29.270417 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfea72a5_eabd_4a81_bccb_ce2a59466007.slice/crio-d4b53209a044d3b04f18b9ddfd700b128ba1c8cf84457e62beda49cd8d5c6381 WatchSource:0}: Error finding container d4b53209a044d3b04f18b9ddfd700b128ba1c8cf84457e62beda49cd8d5c6381: Status 404 returned error can't find the container with id d4b53209a044d3b04f18b9ddfd700b128ba1c8cf84457e62beda49cd8d5c6381 Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.328729 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9hb\" (UniqueName: \"kubernetes.io/projected/5ea3aeb8-779a-45a1-b601-0e6f2776311a-kube-api-access-ph9hb\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.328808 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ea3aeb8-779a-45a1-b601-0e6f2776311a-logs\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.328878 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ea3aeb8-779a-45a1-b601-0e6f2776311a-horizon-secret-key\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.328900 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-config-data\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.328972 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-scripts\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.329817 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-scripts\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.330050 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ea3aeb8-779a-45a1-b601-0e6f2776311a-logs\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.336707 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-config-data\") pod \"horizon-7df4569cb7-bfnpz\" (UID: 
\"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.341034 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ea3aeb8-779a-45a1-b601-0e6f2776311a-horizon-secret-key\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.348089 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9hb\" (UniqueName: \"kubernetes.io/projected/5ea3aeb8-779a-45a1-b601-0e6f2776311a-kube-api-access-ph9hb\") pod \"horizon-7df4569cb7-bfnpz\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.563492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfea72a5-eabd-4a81-bccb-ce2a59466007","Type":"ContainerStarted","Data":"d4b53209a044d3b04f18b9ddfd700b128ba1c8cf84457e62beda49cd8d5c6381"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.591291 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-556xl" event={"ID":"50098315-1895-433b-9d66-df198c579b4e","Type":"ContainerStarted","Data":"99ab82161b41e97ab7c1ebd91e6d6c0cf00ba76e190a938a5a292faf679ad0ab"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.591337 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-556xl" event={"ID":"50098315-1895-433b-9d66-df198c579b4e","Type":"ContainerStarted","Data":"ed812f60a1fe5812ad3dce32b3f2b09d3ed6f3ab4ba28b142e317d52c839509c"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.597242 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4rkhc" 
event={"ID":"bb0e86fd-502f-4fef-9f29-4d612a8d111f","Type":"ContainerStarted","Data":"e80c3e3b9e25726dfdf4341591f776898e52d4221fa26e6ca804babe3f5f6ffd"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.601800 4869 generic.go:334] "Generic (PLEG): container finished" podID="268b9009-c0b5-4ea9-a199-797e2444130c" containerID="f04065ae52669020a1a59f22d5c67f4db5049c8287550e598ab8db2d865651e4" exitCode=0 Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.601858 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" event={"ID":"268b9009-c0b5-4ea9-a199-797e2444130c","Type":"ContainerDied","Data":"f04065ae52669020a1a59f22d5c67f4db5049c8287550e598ab8db2d865651e4"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.615830 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-556xl" podStartSLOduration=2.615808848 podStartE2EDuration="2.615808848s" podCreationTimestamp="2026-03-12 15:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:29.609634591 +0000 UTC m=+1201.894859869" watchObservedRunningTime="2026-03-12 15:07:29.615808848 +0000 UTC m=+1201.901034126" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.626599 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.627017 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aedadb74-4f2b-40b3-9847-20519eea4209","Type":"ContainerStarted","Data":"40f4ed5eedd6b23c3234b849cb149b1e196d3e0abacd82fea591a97dd216c6c8"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.630208 4869 generic.go:334] "Generic (PLEG): container finished" podID="e724f55d-2897-4b64-8a80-cdee522a7143" containerID="9afe1fe8028fdeb1af82340d7026017ffb74f59432a4797d0a4773a4fa812c5a" exitCode=0 Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.630263 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" event={"ID":"e724f55d-2897-4b64-8a80-cdee522a7143","Type":"ContainerDied","Data":"9afe1fe8028fdeb1af82340d7026017ffb74f59432a4797d0a4773a4fa812c5a"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.637751 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5855b8d555-mwzss" event={"ID":"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b","Type":"ContainerStarted","Data":"ba02d63c3419366c138705d1e79cc1b195805a55e665fb84dc33c21cc0ee62c6"} Mar 12 15:07:29 crc kubenswrapper[4869]: I0312 15:07:29.640226 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-9p9gw" event={"ID":"e8721cab-3eb8-4c80-a0c8-79c7e007b614","Type":"ContainerStarted","Data":"cac60e0160fd577099f2d2db83bdb6f515cc375d92843fe6f69a71109923a491"} Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.135934 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.254840 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-svc\") pod \"268b9009-c0b5-4ea9-a199-797e2444130c\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.254898 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-swift-storage-0\") pod \"268b9009-c0b5-4ea9-a199-797e2444130c\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.254931 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-sb\") pod \"268b9009-c0b5-4ea9-a199-797e2444130c\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.254953 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-nb\") pod \"268b9009-c0b5-4ea9-a199-797e2444130c\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.254974 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-config\") pod \"268b9009-c0b5-4ea9-a199-797e2444130c\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.255074 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7mgc\" 
(UniqueName: \"kubernetes.io/projected/268b9009-c0b5-4ea9-a199-797e2444130c-kube-api-access-r7mgc\") pod \"268b9009-c0b5-4ea9-a199-797e2444130c\" (UID: \"268b9009-c0b5-4ea9-a199-797e2444130c\") " Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.454335 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7df4569cb7-bfnpz"] Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.674833 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268b9009-c0b5-4ea9-a199-797e2444130c-kube-api-access-r7mgc" (OuterVolumeSpecName: "kube-api-access-r7mgc") pod "268b9009-c0b5-4ea9-a199-797e2444130c" (UID: "268b9009-c0b5-4ea9-a199-797e2444130c"). InnerVolumeSpecName "kube-api-access-r7mgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.678329 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "268b9009-c0b5-4ea9-a199-797e2444130c" (UID: "268b9009-c0b5-4ea9-a199-797e2444130c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.678354 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-config" (OuterVolumeSpecName: "config") pod "268b9009-c0b5-4ea9-a199-797e2444130c" (UID: "268b9009-c0b5-4ea9-a199-797e2444130c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.679394 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.679623 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7mgc\" (UniqueName: \"kubernetes.io/projected/268b9009-c0b5-4ea9-a199-797e2444130c-kube-api-access-r7mgc\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.679657 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.679775 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "268b9009-c0b5-4ea9-a199-797e2444130c" (UID: "268b9009-c0b5-4ea9-a199-797e2444130c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.680481 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "268b9009-c0b5-4ea9-a199-797e2444130c" (UID: "268b9009-c0b5-4ea9-a199-797e2444130c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.680628 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "268b9009-c0b5-4ea9-a199-797e2444130c" (UID: "268b9009-c0b5-4ea9-a199-797e2444130c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.691132 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" event={"ID":"268b9009-c0b5-4ea9-a199-797e2444130c","Type":"ContainerDied","Data":"857077ba39ecca4d9438182d9d2233c07ccd3b5994633bedba0dd3b66381388e"} Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.691199 4869 scope.go:117] "RemoveContainer" containerID="f04065ae52669020a1a59f22d5c67f4db5049c8287550e598ab8db2d865651e4" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.691203 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ww7mn" Mar 12 15:07:30 crc kubenswrapper[4869]: W0312 15:07:30.699282 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ea3aeb8_779a_45a1_b601_0e6f2776311a.slice/crio-abb919130e0ee799a2d9d0ceb78b73b83569dc57f13a004cb59f39af977fbfbe WatchSource:0}: Error finding container abb919130e0ee799a2d9d0ceb78b73b83569dc57f13a004cb59f39af977fbfbe: Status 404 returned error can't find the container with id abb919130e0ee799a2d9d0ceb78b73b83569dc57f13a004cb59f39af977fbfbe Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.781936 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.781986 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.781995 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/268b9009-c0b5-4ea9-a199-797e2444130c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.783140 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ww7mn"] Mar 12 15:07:30 crc kubenswrapper[4869]: I0312 15:07:30.799333 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ww7mn"] Mar 12 15:07:31 crc kubenswrapper[4869]: I0312 15:07:31.707900 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" 
event={"ID":"e724f55d-2897-4b64-8a80-cdee522a7143","Type":"ContainerStarted","Data":"c707e908255bc7787c8e32007a043fb2f818792e04634519ab5b21f0e75e694d"} Mar 12 15:07:31 crc kubenswrapper[4869]: I0312 15:07:31.709155 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:31 crc kubenswrapper[4869]: I0312 15:07:31.711138 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df4569cb7-bfnpz" event={"ID":"5ea3aeb8-779a-45a1-b601-0e6f2776311a","Type":"ContainerStarted","Data":"abb919130e0ee799a2d9d0ceb78b73b83569dc57f13a004cb59f39af977fbfbe"} Mar 12 15:07:31 crc kubenswrapper[4869]: I0312 15:07:31.718298 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aedadb74-4f2b-40b3-9847-20519eea4209","Type":"ContainerStarted","Data":"aea473439d4553f1b190401df584a21257a0d0d8aa41a8425de7002607e36162"} Mar 12 15:07:31 crc kubenswrapper[4869]: I0312 15:07:31.723793 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfea72a5-eabd-4a81-bccb-ce2a59466007","Type":"ContainerStarted","Data":"9bdddc46067e2d496dd9fd439da79e0fdf11052bb41bd6cbc5da3c431fc88000"} Mar 12 15:07:31 crc kubenswrapper[4869]: I0312 15:07:31.733095 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" podStartSLOduration=4.733077914 podStartE2EDuration="4.733077914s" podCreationTimestamp="2026-03-12 15:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:31.723430268 +0000 UTC m=+1204.008655546" watchObservedRunningTime="2026-03-12 15:07:31.733077914 +0000 UTC m=+1204.018303192" Mar 12 15:07:31 crc kubenswrapper[4869]: E0312 15:07:31.972491 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = 
reading blob sha256:c0641985c80a2eda9edd947fcbab64bf498674c55c3beb0504f3d1a3a943e351: fetching blob: received unexpected HTTP status: 502 Bad Gateway" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 12 15:07:31 crc kubenswrapper[4869]: E0312 15:07:31.972658 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-acces
s-s57vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-t7fgp_openstack(ca1953bb-fc5d-4285-8de4-b67746201d05): ErrImagePull: reading blob sha256:c0641985c80a2eda9edd947fcbab64bf498674c55c3beb0504f3d1a3a943e351: fetching blob: received unexpected HTTP status: 502 Bad Gateway" logger="UnhandledError" Mar 12 15:07:31 crc kubenswrapper[4869]: E0312 15:07:31.973999 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"reading blob sha256:c0641985c80a2eda9edd947fcbab64bf498674c55c3beb0504f3d1a3a943e351: fetching blob: received unexpected HTTP status: 502 Bad Gateway\"" pod="openstack/cinder-db-sync-t7fgp" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05" Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.351933 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268b9009-c0b5-4ea9-a199-797e2444130c" path="/var/lib/kubelet/pods/268b9009-c0b5-4ea9-a199-797e2444130c/volumes" Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.738118 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"aedadb74-4f2b-40b3-9847-20519eea4209","Type":"ContainerStarted","Data":"00f9b6845c52e5cc95094a367d431b00501ccfb18fb55765927fb0a79d1d0c44"} Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.738177 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-log" containerID="cri-o://aea473439d4553f1b190401df584a21257a0d0d8aa41a8425de7002607e36162" gracePeriod=30 Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.738236 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-httpd" containerID="cri-o://00f9b6845c52e5cc95094a367d431b00501ccfb18fb55765927fb0a79d1d0c44" gracePeriod=30 Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.742686 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfea72a5-eabd-4a81-bccb-ce2a59466007","Type":"ContainerStarted","Data":"129921bc6a64894b2d2b91a61a91ce9be61427c612008ba736573d0c97d82793"} Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.742849 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerName="glance-log" containerID="cri-o://9bdddc46067e2d496dd9fd439da79e0fdf11052bb41bd6cbc5da3c431fc88000" gracePeriod=30 Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.742965 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerName="glance-httpd" containerID="cri-o://129921bc6a64894b2d2b91a61a91ce9be61427c612008ba736573d0c97d82793" gracePeriod=30 Mar 12 15:07:32 crc kubenswrapper[4869]: E0312 15:07:32.745743 4869 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-t7fgp" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05" Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.775088 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.775072321 podStartE2EDuration="5.775072321s" podCreationTimestamp="2026-03-12 15:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:32.768009519 +0000 UTC m=+1205.053234797" watchObservedRunningTime="2026-03-12 15:07:32.775072321 +0000 UTC m=+1205.060297599" Mar 12 15:07:32 crc kubenswrapper[4869]: I0312 15:07:32.831404 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.831383725 podStartE2EDuration="6.831383725s" podCreationTimestamp="2026-03-12 15:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:32.804069682 +0000 UTC m=+1205.089294960" watchObservedRunningTime="2026-03-12 15:07:32.831383725 +0000 UTC m=+1205.116609013" Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.757086 4869 generic.go:334] "Generic (PLEG): container finished" podID="aedadb74-4f2b-40b3-9847-20519eea4209" containerID="00f9b6845c52e5cc95094a367d431b00501ccfb18fb55765927fb0a79d1d0c44" exitCode=0 Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.757116 4869 generic.go:334] "Generic (PLEG): container finished" podID="aedadb74-4f2b-40b3-9847-20519eea4209" containerID="aea473439d4553f1b190401df584a21257a0d0d8aa41a8425de7002607e36162" exitCode=143 Mar 12 15:07:33 crc kubenswrapper[4869]: 
I0312 15:07:33.757152 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aedadb74-4f2b-40b3-9847-20519eea4209","Type":"ContainerDied","Data":"00f9b6845c52e5cc95094a367d431b00501ccfb18fb55765927fb0a79d1d0c44"} Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.757196 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aedadb74-4f2b-40b3-9847-20519eea4209","Type":"ContainerDied","Data":"aea473439d4553f1b190401df584a21257a0d0d8aa41a8425de7002607e36162"} Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.764763 4869 generic.go:334] "Generic (PLEG): container finished" podID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerID="129921bc6a64894b2d2b91a61a91ce9be61427c612008ba736573d0c97d82793" exitCode=0 Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.764793 4869 generic.go:334] "Generic (PLEG): container finished" podID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerID="9bdddc46067e2d496dd9fd439da79e0fdf11052bb41bd6cbc5da3c431fc88000" exitCode=143 Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.764850 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfea72a5-eabd-4a81-bccb-ce2a59466007","Type":"ContainerDied","Data":"129921bc6a64894b2d2b91a61a91ce9be61427c612008ba736573d0c97d82793"} Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.764877 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfea72a5-eabd-4a81-bccb-ce2a59466007","Type":"ContainerDied","Data":"9bdddc46067e2d496dd9fd439da79e0fdf11052bb41bd6cbc5da3c431fc88000"} Mar 12 15:07:33 crc kubenswrapper[4869]: I0312 15:07:33.766632 4869 generic.go:334] "Generic (PLEG): container finished" podID="8d771411-1e30-4680-913f-a9e0a9a74e31" containerID="85f0ae6aebb26417371e54af05957b6d246bf047dc19680a33d5123e212433be" exitCode=0 Mar 12 15:07:33 
crc kubenswrapper[4869]: I0312 15:07:33.766750 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r56x2" event={"ID":"8d771411-1e30-4680-913f-a9e0a9a74e31","Type":"ContainerDied","Data":"85f0ae6aebb26417371e54af05957b6d246bf047dc19680a33d5123e212433be"} Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.347583 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cb86d8b85-vl7p7"] Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.444430 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f798b7b68-jtm8h"] Mar 12 15:07:35 crc kubenswrapper[4869]: E0312 15:07:35.444854 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268b9009-c0b5-4ea9-a199-797e2444130c" containerName="init" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.444873 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="268b9009-c0b5-4ea9-a199-797e2444130c" containerName="init" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.445056 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="268b9009-c0b5-4ea9-a199-797e2444130c" containerName="init" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.445947 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.449167 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.454051 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f798b7b68-jtm8h"] Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.476219 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7df4569cb7-bfnpz"] Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.508704 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-776cdff46d-hvjw9"] Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.510925 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.528049 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-776cdff46d-hvjw9"] Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.585465 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-secret-key\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.585513 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-combined-ca-bundle\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.585575 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d63031-e072-466e-ae3c-d829a699b197-logs\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.585674 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt9dr\" (UniqueName: \"kubernetes.io/projected/b4d63031-e072-466e-ae3c-d829a699b197-kube-api-access-pt9dr\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.585703 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-tls-certs\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.585725 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-scripts\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.585761 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-config-data\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688130 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lh9w6\" (UniqueName: \"kubernetes.io/projected/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-kube-api-access-lh9w6\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688257 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-combined-ca-bundle\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688289 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-horizon-secret-key\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688350 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-secret-key\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688423 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-horizon-tls-certs\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688559 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-combined-ca-bundle\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688677 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d63031-e072-466e-ae3c-d829a699b197-logs\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688899 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-scripts\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.688948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt9dr\" (UniqueName: \"kubernetes.io/projected/b4d63031-e072-466e-ae3c-d829a699b197-kube-api-access-pt9dr\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.689024 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-tls-certs\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.689094 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-scripts\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.689188 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-config-data\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.689280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-logs\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.689319 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-config-data\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.689481 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d63031-e072-466e-ae3c-d829a699b197-logs\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.690379 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-scripts\") pod \"horizon-7f798b7b68-jtm8h\" (UID: 
\"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.691021 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-config-data\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.696104 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-secret-key\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.699020 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-combined-ca-bundle\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.702285 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-tls-certs\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.709677 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt9dr\" (UniqueName: \"kubernetes.io/projected/b4d63031-e072-466e-ae3c-d829a699b197-kube-api-access-pt9dr\") pod \"horizon-7f798b7b68-jtm8h\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc 
kubenswrapper[4869]: I0312 15:07:35.765449 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.791376 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-scripts\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.791496 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-logs\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.791522 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-config-data\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.791589 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh9w6\" (UniqueName: \"kubernetes.io/projected/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-kube-api-access-lh9w6\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.791629 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-combined-ca-bundle\") pod \"horizon-776cdff46d-hvjw9\" (UID: 
\"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.791654 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-horizon-secret-key\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.791692 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-horizon-tls-certs\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.793407 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-logs\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.793411 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-scripts\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.793865 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-config-data\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.799355 
4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-combined-ca-bundle\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.801105 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-horizon-tls-certs\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.803413 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-horizon-secret-key\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.813661 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh9w6\" (UniqueName: \"kubernetes.io/projected/f1a34267-2bcd-4a01-b2b6-7528c474a7a2-kube-api-access-lh9w6\") pod \"horizon-776cdff46d-hvjw9\" (UID: \"f1a34267-2bcd-4a01-b2b6-7528c474a7a2\") " pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:35 crc kubenswrapper[4869]: I0312 15:07:35.831136 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:37 crc kubenswrapper[4869]: I0312 15:07:37.553368 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" Mar 12 15:07:37 crc kubenswrapper[4869]: I0312 15:07:37.617643 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-dqt8v"] Mar 12 15:07:37 crc kubenswrapper[4869]: I0312 15:07:37.617908 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="dnsmasq-dns" containerID="cri-o://b315b65b82ba9afaa7e2b718cd9b4b04e3b2b1619ef13670192f0e8e37f67d60" gracePeriod=10 Mar 12 15:07:38 crc kubenswrapper[4869]: I0312 15:07:38.832062 4869 generic.go:334] "Generic (PLEG): container finished" podID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerID="b315b65b82ba9afaa7e2b718cd9b4b04e3b2b1619ef13670192f0e8e37f67d60" exitCode=0 Mar 12 15:07:38 crc kubenswrapper[4869]: I0312 15:07:38.832107 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" event={"ID":"687b2dbd-45fa-4c40-887e-ef9ac210561e","Type":"ContainerDied","Data":"b315b65b82ba9afaa7e2b718cd9b4b04e3b2b1619ef13670192f0e8e37f67d60"} Mar 12 15:07:40 crc kubenswrapper[4869]: I0312 15:07:40.832252 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 12 15:07:45 crc kubenswrapper[4869]: I0312 15:07:45.831179 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 12 
15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.755227 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.755687 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncchcdh89hcch5bbh68dh99h5c5h699h58ch5bbh96hb5h688h64bh547h88hd9h68ch576h646h54dh58bh5b7h57bh54h694h97h566h5bch597h557q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ph9hb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42
400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7df4569cb7-bfnpz_openstack(5ea3aeb8-779a-45a1-b601-0e6f2776311a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.757845 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7df4569cb7-bfnpz" podUID="5ea3aeb8-779a-45a1-b601-0e6f2776311a" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.777925 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.778147 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n685h79h545h665h54fhfdh675hcfh5ch7dh79hc5h4h5b6h8hbch59dh575h7bh78h55chf6h56bh544h67dh54h64fh647h5dchcch55dh549q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwm5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5855b8d555-mwzss_openstack(d7b2cdfe-e6e9-47b3-99de-d14e20f3330b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 
15:07:47.780832 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5855b8d555-mwzss" podUID="d7b2cdfe-e6e9-47b3-99de-d14e20f3330b" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.784174 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.784576 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58ch6dh94h644h5ffh6fh9fh8h596h54hd9h5fdh5d8h5c5h56dh9bh69h648h578hb7h85h588h66ch5dch649h564hbh58dhfbhf9h89hb7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dr48s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5cb86d8b85-vl7p7_openstack(18e6c8a7-be39-4624-b798-c2f6715ab6e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:07:47 crc kubenswrapper[4869]: E0312 15:07:47.788812 
4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5cb86d8b85-vl7p7" podUID="18e6c8a7-be39-4624-b798-c2f6715ab6e8" Mar 12 15:07:48 crc kubenswrapper[4869]: E0312 15:07:48.453571 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 12 15:07:48 crc kubenswrapper[4869]: E0312 15:07:48.453905 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2rtz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-4rkhc_openstack(bb0e86fd-502f-4fef-9f29-4d612a8d111f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:07:48 crc kubenswrapper[4869]: E0312 15:07:48.455109 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-4rkhc" 
podUID="bb0e86fd-502f-4fef-9f29-4d612a8d111f" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.532229 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.537575 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.541947 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.663410 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-config-data\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.663801 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-scripts\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.663833 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-combined-ca-bundle\") pod \"8d771411-1e30-4680-913f-a9e0a9a74e31\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.663889 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-logs\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 
15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.663934 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-credential-keys\") pod \"8d771411-1e30-4680-913f-a9e0a9a74e31\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.663967 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.663992 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-scripts\") pod \"8d771411-1e30-4680-913f-a9e0a9a74e31\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664034 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-internal-tls-certs\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664071 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-logs\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664104 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7zxk\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-kube-api-access-n7zxk\") pod 
\"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664133 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-scripts\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664173 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-config-data\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664212 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-config-data\") pod \"8d771411-1e30-4680-913f-a9e0a9a74e31\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664235 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-combined-ca-bundle\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664276 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgnv6\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-kube-api-access-wgnv6\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664314 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-public-tls-certs\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664354 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-fernet-keys\") pod \"8d771411-1e30-4680-913f-a9e0a9a74e31\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664386 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-ceph\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664412 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6pcb\" (UniqueName: \"kubernetes.io/projected/8d771411-1e30-4680-913f-a9e0a9a74e31-kube-api-access-m6pcb\") pod \"8d771411-1e30-4680-913f-a9e0a9a74e31\" (UID: \"8d771411-1e30-4680-913f-a9e0a9a74e31\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664433 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-ceph\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664472 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-httpd-run\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664514 
4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-httpd-run\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664557 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"aedadb74-4f2b-40b3-9847-20519eea4209\" (UID: \"aedadb74-4f2b-40b3-9847-20519eea4209\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.664587 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-combined-ca-bundle\") pod \"dfea72a5-eabd-4a81-bccb-ce2a59466007\" (UID: \"dfea72a5-eabd-4a81-bccb-ce2a59466007\") " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.668878 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.670382 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.670636 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-logs" (OuterVolumeSpecName: "logs") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.670864 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-logs" (OuterVolumeSpecName: "logs") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.674515 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8d771411-1e30-4680-913f-a9e0a9a74e31" (UID: "8d771411-1e30-4680-913f-a9e0a9a74e31"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.678030 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d771411-1e30-4680-913f-a9e0a9a74e31-kube-api-access-m6pcb" (OuterVolumeSpecName: "kube-api-access-m6pcb") pod "8d771411-1e30-4680-913f-a9e0a9a74e31" (UID: "8d771411-1e30-4680-913f-a9e0a9a74e31"). InnerVolumeSpecName "kube-api-access-m6pcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.678455 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-kube-api-access-wgnv6" (OuterVolumeSpecName: "kube-api-access-wgnv6") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "kube-api-access-wgnv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.678737 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-scripts" (OuterVolumeSpecName: "scripts") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.678969 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.679284 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-ceph" (OuterVolumeSpecName: "ceph") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.679981 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-scripts" (OuterVolumeSpecName: "scripts") pod "8d771411-1e30-4680-913f-a9e0a9a74e31" (UID: "8d771411-1e30-4680-913f-a9e0a9a74e31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.680651 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-kube-api-access-n7zxk" (OuterVolumeSpecName: "kube-api-access-n7zxk") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "kube-api-access-n7zxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.681671 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-scripts" (OuterVolumeSpecName: "scripts") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.682257 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.682895 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-ceph" (OuterVolumeSpecName: "ceph") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.702302 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8d771411-1e30-4680-913f-a9e0a9a74e31" (UID: "8d771411-1e30-4680-913f-a9e0a9a74e31"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.711452 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.712217 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-config-data" (OuterVolumeSpecName: "config-data") pod "8d771411-1e30-4680-913f-a9e0a9a74e31" (UID: "8d771411-1e30-4680-913f-a9e0a9a74e31"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.714841 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.726881 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d771411-1e30-4680-913f-a9e0a9a74e31" (UID: "8d771411-1e30-4680-913f-a9e0a9a74e31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.750869 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.754906 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.760625 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-config-data" (OuterVolumeSpecName: "config-data") pod "aedadb74-4f2b-40b3-9847-20519eea4209" (UID: "aedadb74-4f2b-40b3-9847-20519eea4209"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.763188 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-config-data" (OuterVolumeSpecName: "config-data") pod "dfea72a5-eabd-4a81-bccb-ce2a59466007" (UID: "dfea72a5-eabd-4a81-bccb-ce2a59466007"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767224 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767311 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767326 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767359 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 
15:07:48.767369 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767381 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767393 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aedadb74-4f2b-40b3-9847-20519eea4209-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767403 4869 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767416 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767424 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767433 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfea72a5-eabd-4a81-bccb-ce2a59466007-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767441 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-logs\") on node \"crc\" 
DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767449 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7zxk\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-kube-api-access-n7zxk\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767457 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767467 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767477 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767488 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767497 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgnv6\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-kube-api-access-wgnv6\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767505 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aedadb74-4f2b-40b3-9847-20519eea4209-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767512 4869 reconciler_common.go:293] "Volume 
detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d771411-1e30-4680-913f-a9e0a9a74e31-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767520 4869 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dfea72a5-eabd-4a81-bccb-ce2a59466007-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767528 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6pcb\" (UniqueName: \"kubernetes.io/projected/8d771411-1e30-4680-913f-a9e0a9a74e31-kube-api-access-m6pcb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767538 4869 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aedadb74-4f2b-40b3-9847-20519eea4209-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.767560 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfea72a5-eabd-4a81-bccb-ce2a59466007-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.783027 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.784687 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.869015 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.869044 4869 reconciler_common.go:293] "Volume 
detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.938877 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.938970 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aedadb74-4f2b-40b3-9847-20519eea4209","Type":"ContainerDied","Data":"40f4ed5eedd6b23c3234b849cb149b1e196d3e0abacd82fea591a97dd216c6c8"} Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.940007 4869 scope.go:117] "RemoveContainer" containerID="00f9b6845c52e5cc95094a367d431b00501ccfb18fb55765927fb0a79d1d0c44" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.943239 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dfea72a5-eabd-4a81-bccb-ce2a59466007","Type":"ContainerDied","Data":"d4b53209a044d3b04f18b9ddfd700b128ba1c8cf84457e62beda49cd8d5c6381"} Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.943331 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.948461 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r56x2" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.948497 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r56x2" event={"ID":"8d771411-1e30-4680-913f-a9e0a9a74e31","Type":"ContainerDied","Data":"05e36edbb8fed3a56c5437c1e5180732ff88885e322dddb1ebfa29c65b641e5b"} Mar 12 15:07:48 crc kubenswrapper[4869]: E0312 15:07:48.950315 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-4rkhc" podUID="bb0e86fd-502f-4fef-9f29-4d612a8d111f" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.955630 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05e36edbb8fed3a56c5437c1e5180732ff88885e322dddb1ebfa29c65b641e5b" Mar 12 15:07:48 crc kubenswrapper[4869]: I0312 15:07:48.988490 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.018504 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.045717 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.057949 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.067286 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.069118 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" 
containerName="glance-log" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.069141 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerName="glance-log" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.069164 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-httpd" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.069172 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-httpd" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.069192 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d771411-1e30-4680-913f-a9e0a9a74e31" containerName="keystone-bootstrap" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.069201 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d771411-1e30-4680-913f-a9e0a9a74e31" containerName="keystone-bootstrap" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.069218 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-log" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.069227 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-log" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.069245 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerName="glance-httpd" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.069252 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerName="glance-httpd" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.070086 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerName="glance-log" Mar 12 15:07:49 
crc kubenswrapper[4869]: I0312 15:07:49.070116 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-log" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.070150 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" containerName="glance-httpd" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.070160 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" containerName="glance-httpd" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.070172 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d771411-1e30-4680-913f-a9e0a9a74e31" containerName="keystone-bootstrap" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.076739 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.080256 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.080699 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4pjv" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.080719 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.080877 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.082054 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.082345 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 12 
15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.095030 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.099186 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.103167 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.103381 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.126883 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.167963 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.171352 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.171470 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db 
sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r49ql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-9p9gw_openstack(e8721cab-3eb8-4c80-a0c8-79c7e007b614): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.172517 4869 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-9p9gw" podUID="e8721cab-3eb8-4c80-a0c8-79c7e007b614" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.175963 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176040 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176168 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176216 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176274 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-ceph\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176457 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176490 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hdtp\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-kube-api-access-5hdtp\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176519 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176579 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176704 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfpc\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-kube-api-access-rnfpc\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176758 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176852 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-logs\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.176872 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.178946 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.178994 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.179019 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.179048 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.179095 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.183033 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.189638 4869 scope.go:117] "RemoveContainer" containerID="aea473439d4553f1b190401df584a21257a0d0d8aa41a8425de7002607e36162" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.210116 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.211086 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.281573 4869 scope.go:117] "RemoveContainer" containerID="129921bc6a64894b2d2b91a61a91ce9be61427c612008ba736573d0c97d82793" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.281921 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-svc\") pod \"687b2dbd-45fa-4c40-887e-ef9ac210561e\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.281956 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-swift-storage-0\") pod \"687b2dbd-45fa-4c40-887e-ef9ac210561e\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.281975 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-scripts\") pod \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282007 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp4wj\" (UniqueName: \"kubernetes.io/projected/687b2dbd-45fa-4c40-887e-ef9ac210561e-kube-api-access-hp4wj\") pod \"687b2dbd-45fa-4c40-887e-ef9ac210561e\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282047 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-scripts\") pod \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282095 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18e6c8a7-be39-4624-b798-c2f6715ab6e8-horizon-secret-key\") pod \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282154 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e6c8a7-be39-4624-b798-c2f6715ab6e8-logs\") pod \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282184 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-config-data\") pod \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282199 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-horizon-secret-key\") pod \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282219 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph9hb\" (UniqueName: \"kubernetes.io/projected/5ea3aeb8-779a-45a1-b601-0e6f2776311a-kube-api-access-ph9hb\") pod \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " Mar 12 15:07:49 crc 
kubenswrapper[4869]: I0312 15:07:49.282269 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwm5b\" (UniqueName: \"kubernetes.io/projected/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-kube-api-access-rwm5b\") pod \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282302 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-config-data\") pod \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282317 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-scripts\") pod \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282335 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr48s\" (UniqueName: \"kubernetes.io/projected/18e6c8a7-be39-4624-b798-c2f6715ab6e8-kube-api-access-dr48s\") pod \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\" (UID: \"18e6c8a7-be39-4624-b798-c2f6715ab6e8\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282349 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ea3aeb8-779a-45a1-b601-0e6f2776311a-logs\") pod \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282370 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/5ea3aeb8-779a-45a1-b601-0e6f2776311a-horizon-secret-key\") pod \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\" (UID: \"5ea3aeb8-779a-45a1-b601-0e6f2776311a\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282390 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-nb\") pod \"687b2dbd-45fa-4c40-887e-ef9ac210561e\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282406 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-config-data\") pod \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282423 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-logs\") pod \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\" (UID: \"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282444 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-config\") pod \"687b2dbd-45fa-4c40-887e-ef9ac210561e\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282462 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-sb\") pod \"687b2dbd-45fa-4c40-887e-ef9ac210561e\" (UID: \"687b2dbd-45fa-4c40-887e-ef9ac210561e\") " Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282635 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-logs\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282652 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282688 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282710 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282728 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282746 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282770 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282791 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282816 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282859 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282881 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-ceph\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282903 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282918 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hdtp\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-kube-api-access-5hdtp\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282936 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282953 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282975 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfpc\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-kube-api-access-rnfpc\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.282993 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.283372 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.313901 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-scripts" (OuterVolumeSpecName: "scripts") pod "18e6c8a7-be39-4624-b798-c2f6715ab6e8" (UID: "18e6c8a7-be39-4624-b798-c2f6715ab6e8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.318555 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-config-data" (OuterVolumeSpecName: "config-data") pod "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b" (UID: "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.319296 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-config-data" (OuterVolumeSpecName: "config-data") pod "5ea3aeb8-779a-45a1-b601-0e6f2776311a" (UID: "5ea3aeb8-779a-45a1-b601-0e6f2776311a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.319805 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-scripts" (OuterVolumeSpecName: "scripts") pod "5ea3aeb8-779a-45a1-b601-0e6f2776311a" (UID: "5ea3aeb8-779a-45a1-b601-0e6f2776311a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.320951 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687b2dbd-45fa-4c40-887e-ef9ac210561e-kube-api-access-hp4wj" (OuterVolumeSpecName: "kube-api-access-hp4wj") pod "687b2dbd-45fa-4c40-887e-ef9ac210561e" (UID: "687b2dbd-45fa-4c40-887e-ef9ac210561e"). InnerVolumeSpecName "kube-api-access-hp4wj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.321491 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.321996 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e6c8a7-be39-4624-b798-c2f6715ab6e8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "18e6c8a7-be39-4624-b798-c2f6715ab6e8" (UID: "18e6c8a7-be39-4624-b798-c2f6715ab6e8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.322021 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.323139 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-scripts" (OuterVolumeSpecName: "scripts") pod "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b" (UID: "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.323996 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.324341 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-logs\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.324999 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ea3aeb8-779a-45a1-b601-0e6f2776311a-logs" (OuterVolumeSpecName: "logs") pod "5ea3aeb8-779a-45a1-b601-0e6f2776311a" (UID: "5ea3aeb8-779a-45a1-b601-0e6f2776311a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.326417 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e6c8a7-be39-4624-b798-c2f6715ab6e8-logs" (OuterVolumeSpecName: "logs") pod "18e6c8a7-be39-4624-b798-c2f6715ab6e8" (UID: "18e6c8a7-be39-4624-b798-c2f6715ab6e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.326964 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-config-data" (OuterVolumeSpecName: "config-data") pod "18e6c8a7-be39-4624-b798-c2f6715ab6e8" (UID: "18e6c8a7-be39-4624-b798-c2f6715ab6e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.327286 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.327293 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-ceph\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.327820 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-logs\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.328442 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-logs" (OuterVolumeSpecName: "logs") pod "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b" (UID: "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.328832 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ea3aeb8-779a-45a1-b601-0e6f2776311a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5ea3aeb8-779a-45a1-b601-0e6f2776311a" (UID: "5ea3aeb8-779a-45a1-b601-0e6f2776311a"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.337446 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-kube-api-access-rwm5b" (OuterVolumeSpecName: "kube-api-access-rwm5b") pod "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b" (UID: "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b"). InnerVolumeSpecName "kube-api-access-rwm5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.347286 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.359144 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e6c8a7-be39-4624-b798-c2f6715ab6e8-kube-api-access-dr48s" (OuterVolumeSpecName: "kube-api-access-dr48s") pod "18e6c8a7-be39-4624-b798-c2f6715ab6e8" (UID: "18e6c8a7-be39-4624-b798-c2f6715ab6e8"). InnerVolumeSpecName "kube-api-access-dr48s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.359279 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.359940 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.363395 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.364395 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.364720 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ea3aeb8-779a-45a1-b601-0e6f2776311a-kube-api-access-ph9hb" (OuterVolumeSpecName: "kube-api-access-ph9hb") pod "5ea3aeb8-779a-45a1-b601-0e6f2776311a" (UID: "5ea3aeb8-779a-45a1-b601-0e6f2776311a"). InnerVolumeSpecName "kube-api-access-ph9hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.366388 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hdtp\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-kube-api-access-5hdtp\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.367260 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b" (UID: "d7b2cdfe-e6e9-47b3-99de-d14e20f3330b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387675 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwm5b\" (UniqueName: \"kubernetes.io/projected/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-kube-api-access-rwm5b\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387708 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387722 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ea3aeb8-779a-45a1-b601-0e6f2776311a-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387733 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ea3aeb8-779a-45a1-b601-0e6f2776311a-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 
15:07:49.387754 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr48s\" (UniqueName: \"kubernetes.io/projected/18e6c8a7-be39-4624-b798-c2f6715ab6e8-kube-api-access-dr48s\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387766 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5ea3aeb8-779a-45a1-b601-0e6f2776311a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387777 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387788 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387802 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387813 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp4wj\" (UniqueName: \"kubernetes.io/projected/687b2dbd-45fa-4c40-887e-ef9ac210561e-kube-api-access-hp4wj\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387825 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387836 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/18e6c8a7-be39-4624-b798-c2f6715ab6e8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387848 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18e6c8a7-be39-4624-b798-c2f6715ab6e8-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387858 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18e6c8a7-be39-4624-b798-c2f6715ab6e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387868 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.387879 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph9hb\" (UniqueName: \"kubernetes.io/projected/5ea3aeb8-779a-45a1-b601-0e6f2776311a-kube-api-access-ph9hb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.398191 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.398412 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.404654 4869 scope.go:117] 
"RemoveContainer" containerID="9bdddc46067e2d496dd9fd439da79e0fdf11052bb41bd6cbc5da3c431fc88000" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.408412 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfpc\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-kube-api-access-rnfpc\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.410415 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.412838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.503075 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.563767 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-config" (OuterVolumeSpecName: "config") pod "687b2dbd-45fa-4c40-887e-ef9ac210561e" (UID: "687b2dbd-45fa-4c40-887e-ef9ac210561e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.586260 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "687b2dbd-45fa-4c40-887e-ef9ac210561e" (UID: "687b2dbd-45fa-4c40-887e-ef9ac210561e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.594080 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "687b2dbd-45fa-4c40-887e-ef9ac210561e" (UID: "687b2dbd-45fa-4c40-887e-ef9ac210561e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.594337 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.594378 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.596852 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "687b2dbd-45fa-4c40-887e-ef9ac210561e" (UID: "687b2dbd-45fa-4c40-887e-ef9ac210561e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.606456 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "687b2dbd-45fa-4c40-887e-ef9ac210561e" (UID: "687b2dbd-45fa-4c40-887e-ef9ac210561e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.622641 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r56x2"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.628501 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r56x2"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.684954 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.685013 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.696104 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.696134 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.696144 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/687b2dbd-45fa-4c40-887e-ef9ac210561e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.709319 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.726971 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-776cdff46d-hvjw9"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.728693 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.735765 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jpbgs"] Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.736129 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="init" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.736149 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="init" Mar 12 15:07:49 crc kubenswrapper[4869]: E0312 15:07:49.736176 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="dnsmasq-dns" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.736184 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="dnsmasq-dns" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.736342 4869 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" containerName="dnsmasq-dns" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.736899 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.739857 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.739875 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.740778 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.740945 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.741106 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cl6hz" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.742867 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jpbgs"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.804147 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-combined-ca-bundle\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.804223 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-fernet-keys\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " 
pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.804260 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-scripts\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.804282 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4zfk\" (UniqueName: \"kubernetes.io/projected/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-kube-api-access-h4zfk\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.804332 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-credential-keys\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.804351 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-config-data\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.811157 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f798b7b68-jtm8h"] Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.905970 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-credential-keys\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.906392 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-config-data\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.906507 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-combined-ca-bundle\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.906591 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-fernet-keys\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.906627 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-scripts\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.906656 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4zfk\" (UniqueName: \"kubernetes.io/projected/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-kube-api-access-h4zfk\") pod 
\"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.914337 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-credential-keys\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.914764 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-combined-ca-bundle\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.918838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-scripts\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.918967 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-fernet-keys\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.919947 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-config-data\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: 
I0312 15:07:49.924894 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4zfk\" (UniqueName: \"kubernetes.io/projected/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-kube-api-access-h4zfk\") pod \"keystone-bootstrap-jpbgs\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.928044 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.959961 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776cdff46d-hvjw9" event={"ID":"f1a34267-2bcd-4a01-b2b6-7528c474a7a2","Type":"ContainerStarted","Data":"b6aa4d2c344519c842b11608315f3d386b0033f55184ff2a31a736a64dab152f"} Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.962563 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5855b8d555-mwzss" event={"ID":"d7b2cdfe-e6e9-47b3-99de-d14e20f3330b","Type":"ContainerDied","Data":"ba02d63c3419366c138705d1e79cc1b195805a55e665fb84dc33c21cc0ee62c6"} Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.962662 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5855b8d555-mwzss" Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.975428 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f798b7b68-jtm8h" event={"ID":"b4d63031-e072-466e-ae3c-d829a699b197","Type":"ContainerStarted","Data":"9c0db23a5ffbd9280c69817bf2cfa28632e216ba14ae4aad3ba2920b3f9b6ed9"} Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.980581 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vxdx8" event={"ID":"2cadca9f-20ef-432c-8816-e5fea0d0c93e","Type":"ContainerStarted","Data":"f6cfca39163cd028fe9ac1d3846a8353e6ddb513810a96eec760ed92c908ebaa"} Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.986233 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7df4569cb7-bfnpz" event={"ID":"5ea3aeb8-779a-45a1-b601-0e6f2776311a","Type":"ContainerDied","Data":"abb919130e0ee799a2d9d0ceb78b73b83569dc57f13a004cb59f39af977fbfbe"} Mar 12 15:07:49 crc kubenswrapper[4869]: I0312 15:07:49.986401 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7df4569cb7-bfnpz" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.000888 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cb86d8b85-vl7p7" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.000661 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cb86d8b85-vl7p7" event={"ID":"18e6c8a7-be39-4624-b798-c2f6715ab6e8","Type":"ContainerDied","Data":"f151fd9be1879756534d902ecc6d4dfe097c1dfb06c19f5d3b72c00b981e0121"} Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.013018 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" event={"ID":"687b2dbd-45fa-4c40-887e-ef9ac210561e","Type":"ContainerDied","Data":"158b1a2ed592c6ece3859f48d75672fa87955cb0c45cc673b3c82337bc2bb875"} Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.013096 4869 scope.go:117] "RemoveContainer" containerID="b315b65b82ba9afaa7e2b718cd9b4b04e3b2b1619ef13670192f0e8e37f67d60" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.013417 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-dqt8v" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.019281 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-vxdx8" podStartSLOduration=4.104303544 podStartE2EDuration="24.019262964s" podCreationTimestamp="2026-03-12 15:07:26 +0000 UTC" firstStartedPulling="2026-03-12 15:07:28.528614345 +0000 UTC m=+1200.813839633" lastFinishedPulling="2026-03-12 15:07:48.443573775 +0000 UTC m=+1220.728799053" observedRunningTime="2026-03-12 15:07:49.999969981 +0000 UTC m=+1222.285195289" watchObservedRunningTime="2026-03-12 15:07:50.019262964 +0000 UTC m=+1222.304488242" Mar 12 15:07:50 crc kubenswrapper[4869]: E0312 15:07:50.033746 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" 
pod="openstack/manila-db-sync-9p9gw" podUID="e8721cab-3eb8-4c80-a0c8-79c7e007b614" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.074876 4869 scope.go:117] "RemoveContainer" containerID="bc00f50c343fcdba62d10919ead04d581abc1b4c00ecde72c2e7b4ba050e0b81" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.114024 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5855b8d555-mwzss"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.129313 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5855b8d555-mwzss"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.144808 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cb86d8b85-vl7p7"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.152180 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cb86d8b85-vl7p7"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.162429 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-dqt8v"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.170984 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-dqt8v"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.185367 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7df4569cb7-bfnpz"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.192430 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7df4569cb7-bfnpz"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.355347 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e6c8a7-be39-4624-b798-c2f6715ab6e8" path="/var/lib/kubelet/pods/18e6c8a7-be39-4624-b798-c2f6715ab6e8/volumes" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.356266 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ea3aeb8-779a-45a1-b601-0e6f2776311a" 
path="/var/lib/kubelet/pods/5ea3aeb8-779a-45a1-b601-0e6f2776311a/volumes" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.358954 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687b2dbd-45fa-4c40-887e-ef9ac210561e" path="/var/lib/kubelet/pods/687b2dbd-45fa-4c40-887e-ef9ac210561e/volumes" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.360527 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d771411-1e30-4680-913f-a9e0a9a74e31" path="/var/lib/kubelet/pods/8d771411-1e30-4680-913f-a9e0a9a74e31/volumes" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.362174 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aedadb74-4f2b-40b3-9847-20519eea4209" path="/var/lib/kubelet/pods/aedadb74-4f2b-40b3-9847-20519eea4209/volumes" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.364814 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b2cdfe-e6e9-47b3-99de-d14e20f3330b" path="/var/lib/kubelet/pods/d7b2cdfe-e6e9-47b3-99de-d14e20f3330b/volumes" Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.365759 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfea72a5-eabd-4a81-bccb-ce2a59466007" path="/var/lib/kubelet/pods/dfea72a5-eabd-4a81-bccb-ce2a59466007/volumes" Mar 12 15:07:50 crc kubenswrapper[4869]: W0312 15:07:50.368557 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2cc7b5a_a92b_4ad2_8778_d6c25782eeb6.slice/crio-49683d4279b6798f13e61af4fe8150324e31674036fc6f7529b14a0bc0b88b1f WatchSource:0}: Error finding container 49683d4279b6798f13e61af4fe8150324e31674036fc6f7529b14a0bc0b88b1f: Status 404 returned error can't find the container with id 49683d4279b6798f13e61af4fe8150324e31674036fc6f7529b14a0bc0b88b1f Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.369819 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.540250 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:07:50 crc kubenswrapper[4869]: W0312 15:07:50.541023 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc423b809_18ee_493c_91b6_d30846b0d68b.slice/crio-01c200ab9f8e075cedd24d767c813308013cc63c491fd7aaaf63118ff09ea6e5 WatchSource:0}: Error finding container 01c200ab9f8e075cedd24d767c813308013cc63c491fd7aaaf63118ff09ea6e5: Status 404 returned error can't find the container with id 01c200ab9f8e075cedd24d767c813308013cc63c491fd7aaaf63118ff09ea6e5 Mar 12 15:07:50 crc kubenswrapper[4869]: I0312 15:07:50.558192 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jpbgs"] Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.033129 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f798b7b68-jtm8h" event={"ID":"b4d63031-e072-466e-ae3c-d829a699b197","Type":"ContainerStarted","Data":"a26eec3d4fd18f4819fca706f27ddd2aa65eae9751ad98f439f13147a5de8480"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.034935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c423b809-18ee-493c-91b6-d30846b0d68b","Type":"ContainerStarted","Data":"01c200ab9f8e075cedd24d767c813308013cc63c491fd7aaaf63118ff09ea6e5"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.037457 4869 generic.go:334] "Generic (PLEG): container finished" podID="50098315-1895-433b-9d66-df198c579b4e" containerID="99ab82161b41e97ab7c1ebd91e6d6c0cf00ba76e190a938a5a292faf679ad0ab" exitCode=0 Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.037535 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-556xl" 
event={"ID":"50098315-1895-433b-9d66-df198c579b4e","Type":"ContainerDied","Data":"99ab82161b41e97ab7c1ebd91e6d6c0cf00ba76e190a938a5a292faf679ad0ab"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.041295 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776cdff46d-hvjw9" event={"ID":"f1a34267-2bcd-4a01-b2b6-7528c474a7a2","Type":"ContainerStarted","Data":"e20ec2d048ae17f8860ce38953cbd8bf6aa7220ec2a7aba163248e4196ff39d5"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.041329 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-776cdff46d-hvjw9" event={"ID":"f1a34267-2bcd-4a01-b2b6-7528c474a7a2","Type":"ContainerStarted","Data":"5858ec20f4a5321b8238f9ca8015ca6c0c32f30b32d1269a3046ccdd031cfe97"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.057178 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpbgs" event={"ID":"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718","Type":"ContainerStarted","Data":"749a7118013cbc6f8c1739816d664a47f147c25b58b38882f1e0f5390a076d26"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.057238 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpbgs" event={"ID":"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718","Type":"ContainerStarted","Data":"31f1258ddb97f63c00b4197490116e1d79d34fee5bcbf84faf549183f728e5d7"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.064050 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6","Type":"ContainerStarted","Data":"49683d4279b6798f13e61af4fe8150324e31674036fc6f7529b14a0bc0b88b1f"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.069297 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerStarted","Data":"f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc"} Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.081413 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-776cdff46d-hvjw9" podStartSLOduration=15.526184879 podStartE2EDuration="16.081387878s" podCreationTimestamp="2026-03-12 15:07:35 +0000 UTC" firstStartedPulling="2026-03-12 15:07:49.773403069 +0000 UTC m=+1222.058628347" lastFinishedPulling="2026-03-12 15:07:50.328606078 +0000 UTC m=+1222.613831346" observedRunningTime="2026-03-12 15:07:51.072905865 +0000 UTC m=+1223.358131143" watchObservedRunningTime="2026-03-12 15:07:51.081387878 +0000 UTC m=+1223.366613156" Mar 12 15:07:51 crc kubenswrapper[4869]: I0312 15:07:51.101906 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jpbgs" podStartSLOduration=2.101883725 podStartE2EDuration="2.101883725s" podCreationTimestamp="2026-03-12 15:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:51.09297264 +0000 UTC m=+1223.378197918" watchObservedRunningTime="2026-03-12 15:07:51.101883725 +0000 UTC m=+1223.387109003" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.093293 4869 generic.go:334] "Generic (PLEG): container finished" podID="2cadca9f-20ef-432c-8816-e5fea0d0c93e" containerID="f6cfca39163cd028fe9ac1d3846a8353e6ddb513810a96eec760ed92c908ebaa" exitCode=0 Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.093996 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vxdx8" event={"ID":"2cadca9f-20ef-432c-8816-e5fea0d0c93e","Type":"ContainerDied","Data":"f6cfca39163cd028fe9ac1d3846a8353e6ddb513810a96eec760ed92c908ebaa"} Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.126092 4869 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c423b809-18ee-493c-91b6-d30846b0d68b","Type":"ContainerStarted","Data":"26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e"} Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.126133 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c423b809-18ee-493c-91b6-d30846b0d68b","Type":"ContainerStarted","Data":"c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7"} Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.130453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6","Type":"ContainerStarted","Data":"e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909"} Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.130509 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6","Type":"ContainerStarted","Data":"6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd"} Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.148048 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f798b7b68-jtm8h" event={"ID":"b4d63031-e072-466e-ae3c-d829a699b197","Type":"ContainerStarted","Data":"8d06094a459cda2058ff32c4ae52ed8cd8b1f55cc120d4cd9b716f47aa66a7f1"} Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.166562 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.1665253 podStartE2EDuration="4.1665253s" podCreationTimestamp="2026-03-12 15:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:52.15290257 +0000 UTC m=+1224.438127858" watchObservedRunningTime="2026-03-12 
15:07:52.1665253 +0000 UTC m=+1224.451750578" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.197499 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7f798b7b68-jtm8h" podStartSLOduration=16.424809477 podStartE2EDuration="17.197475357s" podCreationTimestamp="2026-03-12 15:07:35 +0000 UTC" firstStartedPulling="2026-03-12 15:07:49.831747661 +0000 UTC m=+1222.116972939" lastFinishedPulling="2026-03-12 15:07:50.604413541 +0000 UTC m=+1222.889638819" observedRunningTime="2026-03-12 15:07:52.183727133 +0000 UTC m=+1224.468952411" watchObservedRunningTime="2026-03-12 15:07:52.197475357 +0000 UTC m=+1224.482700635" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.245169 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.245145813 podStartE2EDuration="4.245145813s" podCreationTimestamp="2026-03-12 15:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:52.211579991 +0000 UTC m=+1224.496805269" watchObservedRunningTime="2026-03-12 15:07:52.245145813 +0000 UTC m=+1224.530371091" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.558948 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.582097 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-config\") pod \"50098315-1895-433b-9d66-df198c579b4e\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.582231 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-combined-ca-bundle\") pod \"50098315-1895-433b-9d66-df198c579b4e\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.582322 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8lf\" (UniqueName: \"kubernetes.io/projected/50098315-1895-433b-9d66-df198c579b4e-kube-api-access-hx8lf\") pod \"50098315-1895-433b-9d66-df198c579b4e\" (UID: \"50098315-1895-433b-9d66-df198c579b4e\") " Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.599900 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50098315-1895-433b-9d66-df198c579b4e-kube-api-access-hx8lf" (OuterVolumeSpecName: "kube-api-access-hx8lf") pod "50098315-1895-433b-9d66-df198c579b4e" (UID: "50098315-1895-433b-9d66-df198c579b4e"). InnerVolumeSpecName "kube-api-access-hx8lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.609377 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50098315-1895-433b-9d66-df198c579b4e" (UID: "50098315-1895-433b-9d66-df198c579b4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.627316 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-config" (OuterVolumeSpecName: "config") pod "50098315-1895-433b-9d66-df198c579b4e" (UID: "50098315-1895-433b-9d66-df198c579b4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.684016 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.684323 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8lf\" (UniqueName: \"kubernetes.io/projected/50098315-1895-433b-9d66-df198c579b4e-kube-api-access-hx8lf\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:52 crc kubenswrapper[4869]: I0312 15:07:52.684335 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/50098315-1895-433b-9d66-df198c579b4e-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.184120 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-556xl" event={"ID":"50098315-1895-433b-9d66-df198c579b4e","Type":"ContainerDied","Data":"ed812f60a1fe5812ad3dce32b3f2b09d3ed6f3ab4ba28b142e317d52c839509c"} Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.184178 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed812f60a1fe5812ad3dce32b3f2b09d3ed6f3ab4ba28b142e317d52c839509c" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.184245 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-556xl" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.220013 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-78k6r"] Mar 12 15:07:53 crc kubenswrapper[4869]: E0312 15:07:53.220398 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50098315-1895-433b-9d66-df198c579b4e" containerName="neutron-db-sync" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.220415 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="50098315-1895-433b-9d66-df198c579b4e" containerName="neutron-db-sync" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.220613 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="50098315-1895-433b-9d66-df198c579b4e" containerName="neutron-db-sync" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.221451 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.246422 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-78k6r"] Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.295555 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.295644 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pzfj\" (UniqueName: \"kubernetes.io/projected/95e0a90a-0269-471b-bd8d-a110809af063-kube-api-access-7pzfj\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " 
pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.295747 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.295836 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-svc\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.295908 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-config\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.295925 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.316281 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f7dc9cccb-h2224"] Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.327649 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.329461 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f7dc9cccb-h2224"] Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.333109 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.333601 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.334343 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.334426 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dhzrk" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401080 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-httpd-config\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401146 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-config\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401173 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " 
pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401222 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401272 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pzfj\" (UniqueName: \"kubernetes.io/projected/95e0a90a-0269-471b-bd8d-a110809af063-kube-api-access-7pzfj\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401290 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-config\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401338 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401429 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-combined-ca-bundle\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " 
pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401472 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-ovndb-tls-certs\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401525 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tkpf\" (UniqueName: \"kubernetes.io/projected/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-kube-api-access-9tkpf\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.401580 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-svc\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.402711 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-svc\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.405757 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc 
kubenswrapper[4869]: I0312 15:07:53.409520 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.409520 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.410224 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-config\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.464347 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pzfj\" (UniqueName: \"kubernetes.io/projected/95e0a90a-0269-471b-bd8d-a110809af063-kube-api-access-7pzfj\") pod \"dnsmasq-dns-55f844cf75-78k6r\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.503034 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-ovndb-tls-certs\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.503353 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9tkpf\" (UniqueName: \"kubernetes.io/projected/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-kube-api-access-9tkpf\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.503501 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-httpd-config\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.504113 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-config\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.504324 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-combined-ca-bundle\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.508430 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-httpd-config\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.510353 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-config\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.510827 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-ovndb-tls-certs\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.510951 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-combined-ca-bundle\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.537463 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tkpf\" (UniqueName: \"kubernetes.io/projected/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-kube-api-access-9tkpf\") pod \"neutron-7f7dc9cccb-h2224\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.605811 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:53 crc kubenswrapper[4869]: I0312 15:07:53.647269 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.222878 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerStarted","Data":"d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3"} Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.316415 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55d7966795-cnsx8"] Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.318345 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.321945 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.322002 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.331855 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55d7966795-cnsx8"] Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.337265 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-internal-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.337309 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-combined-ca-bundle\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " 
pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.337347 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-public-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.337377 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lqw7\" (UniqueName: \"kubernetes.io/projected/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-kube-api-access-8lqw7\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.337502 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-config\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.337569 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-ovndb-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.337757 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-httpd-config\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " 
pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.438792 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-combined-ca-bundle\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.438843 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-public-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.438865 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lqw7\" (UniqueName: \"kubernetes.io/projected/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-kube-api-access-8lqw7\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.438900 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-config\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.438928 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-ovndb-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 
15:07:55.438993 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-httpd-config\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.439036 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-internal-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.445866 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-combined-ca-bundle\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.448010 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-config\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.449212 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-ovndb-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.450292 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-internal-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.450654 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-public-tls-certs\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.451011 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-httpd-config\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.466373 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lqw7\" (UniqueName: \"kubernetes.io/projected/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-kube-api-access-8lqw7\") pod \"neutron-55d7966795-cnsx8\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.574257 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.640247 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.749229 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-combined-ca-bundle\") pod \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.749350 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cadca9f-20ef-432c-8816-e5fea0d0c93e-logs\") pod \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.749370 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zgmn\" (UniqueName: \"kubernetes.io/projected/2cadca9f-20ef-432c-8816-e5fea0d0c93e-kube-api-access-5zgmn\") pod \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.749422 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-config-data\") pod \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.749570 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-scripts\") pod \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\" (UID: \"2cadca9f-20ef-432c-8816-e5fea0d0c93e\") " Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.749941 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2cadca9f-20ef-432c-8816-e5fea0d0c93e-logs" (OuterVolumeSpecName: "logs") pod "2cadca9f-20ef-432c-8816-e5fea0d0c93e" (UID: "2cadca9f-20ef-432c-8816-e5fea0d0c93e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.754500 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-scripts" (OuterVolumeSpecName: "scripts") pod "2cadca9f-20ef-432c-8816-e5fea0d0c93e" (UID: "2cadca9f-20ef-432c-8816-e5fea0d0c93e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.761757 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cadca9f-20ef-432c-8816-e5fea0d0c93e-kube-api-access-5zgmn" (OuterVolumeSpecName: "kube-api-access-5zgmn") pod "2cadca9f-20ef-432c-8816-e5fea0d0c93e" (UID: "2cadca9f-20ef-432c-8816-e5fea0d0c93e"). InnerVolumeSpecName "kube-api-access-5zgmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.765927 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.766022 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.775801 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cadca9f-20ef-432c-8816-e5fea0d0c93e" (UID: "2cadca9f-20ef-432c-8816-e5fea0d0c93e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.776675 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-config-data" (OuterVolumeSpecName: "config-data") pod "2cadca9f-20ef-432c-8816-e5fea0d0c93e" (UID: "2cadca9f-20ef-432c-8816-e5fea0d0c93e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.831296 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.832127 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.852003 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cadca9f-20ef-432c-8816-e5fea0d0c93e-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.852033 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zgmn\" (UniqueName: \"kubernetes.io/projected/2cadca9f-20ef-432c-8816-e5fea0d0c93e-kube-api-access-5zgmn\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.852045 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.852055 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.852066 4869 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cadca9f-20ef-432c-8816-e5fea0d0c93e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:55 crc kubenswrapper[4869]: I0312 15:07:55.993256 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-78k6r"] Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.100024 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55d7966795-cnsx8"] Mar 12 15:07:56 crc kubenswrapper[4869]: W0312 15:07:56.110980 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc4e8fd1_48da_4dac_8d0b_bed00a35ad66.slice/crio-e52be672c76eb099580f00a5b20ddd70cf897c2aefb5749b43d653d20d568f8a WatchSource:0}: Error finding container e52be672c76eb099580f00a5b20ddd70cf897c2aefb5749b43d653d20d568f8a: Status 404 returned error can't find the container with id e52be672c76eb099580f00a5b20ddd70cf897c2aefb5749b43d653d20d568f8a Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.233514 4869 generic.go:334] "Generic (PLEG): container finished" podID="95e0a90a-0269-471b-bd8d-a110809af063" containerID="6e007d00dcfa5ddc50317efa17d7f0343c22e8dafe1fe1f1b4c1137ced411ee9" exitCode=0 Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.233610 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" event={"ID":"95e0a90a-0269-471b-bd8d-a110809af063","Type":"ContainerDied","Data":"6e007d00dcfa5ddc50317efa17d7f0343c22e8dafe1fe1f1b4c1137ced411ee9"} Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.233852 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" event={"ID":"95e0a90a-0269-471b-bd8d-a110809af063","Type":"ContainerStarted","Data":"26c39212bc9b61dd7d52fb1ab0bd238568eff9f3f56ce2547dd801cf4f2da319"} Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.236386 4869 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-55d7966795-cnsx8" event={"ID":"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66","Type":"ContainerStarted","Data":"e52be672c76eb099580f00a5b20ddd70cf897c2aefb5749b43d653d20d568f8a"} Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.238608 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vxdx8" event={"ID":"2cadca9f-20ef-432c-8816-e5fea0d0c93e","Type":"ContainerDied","Data":"62cc4671254e525dedec9a6f9463ca964477738887c8e971a10fa649f947755a"} Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.238651 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62cc4671254e525dedec9a6f9463ca964477738887c8e971a10fa649f947755a" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.238698 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vxdx8" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.705284 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-666b564cfb-pk78f"] Mar 12 15:07:56 crc kubenswrapper[4869]: E0312 15:07:56.706404 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cadca9f-20ef-432c-8816-e5fea0d0c93e" containerName="placement-db-sync" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.706427 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cadca9f-20ef-432c-8816-e5fea0d0c93e" containerName="placement-db-sync" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.706632 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cadca9f-20ef-432c-8816-e5fea0d0c93e" containerName="placement-db-sync" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.707674 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.718890 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.719041 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-k4mns" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.719317 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.719433 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.719617 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.722875 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-666b564cfb-pk78f"] Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.889060 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/340fff5b-1313-4cbd-82fc-fee5941c1aef-logs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.889113 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-combined-ca-bundle\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.889155 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-public-tls-certs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.889172 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ldzv\" (UniqueName: \"kubernetes.io/projected/340fff5b-1313-4cbd-82fc-fee5941c1aef-kube-api-access-9ldzv\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.889199 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-internal-tls-certs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.889237 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-scripts\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.889255 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-config-data\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.932274 4869 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f7dc9cccb-h2224"] Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.990571 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-public-tls-certs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.990615 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ldzv\" (UniqueName: \"kubernetes.io/projected/340fff5b-1313-4cbd-82fc-fee5941c1aef-kube-api-access-9ldzv\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.990647 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-internal-tls-certs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.990686 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-scripts\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.990710 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-config-data\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " 
pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.990775 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/340fff5b-1313-4cbd-82fc-fee5941c1aef-logs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.990803 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-combined-ca-bundle\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.993284 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/340fff5b-1313-4cbd-82fc-fee5941c1aef-logs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.995393 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-combined-ca-bundle\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:56 crc kubenswrapper[4869]: I0312 15:07:56.996928 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-public-tls-certs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.003374 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-scripts\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.004048 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-internal-tls-certs\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.004280 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-config-data\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.072723 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ldzv\" (UniqueName: \"kubernetes.io/projected/340fff5b-1313-4cbd-82fc-fee5941c1aef-kube-api-access-9ldzv\") pod \"placement-666b564cfb-pk78f\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.250193 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f7dc9cccb-h2224" event={"ID":"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55","Type":"ContainerStarted","Data":"1278217bb1774c11ce94df9c8b2d85393904c84d9a532b6e7bd9869fc86adf7b"} Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.250528 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f7dc9cccb-h2224" 
event={"ID":"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55","Type":"ContainerStarted","Data":"d40b7506475eaea5422b6633c3e9528b8dc954bd0913f646753609e71f8341aa"} Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.252578 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" event={"ID":"95e0a90a-0269-471b-bd8d-a110809af063","Type":"ContainerStarted","Data":"779f0090be3e86acdb1e053767078ea8011c69e92098502aee66b94993914d15"} Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.253155 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.255092 4869 generic.go:334] "Generic (PLEG): container finished" podID="011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" containerID="749a7118013cbc6f8c1739816d664a47f147c25b58b38882f1e0f5390a076d26" exitCode=0 Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.255121 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpbgs" event={"ID":"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718","Type":"ContainerDied","Data":"749a7118013cbc6f8c1739816d664a47f147c25b58b38882f1e0f5390a076d26"} Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.258033 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55d7966795-cnsx8" event={"ID":"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66","Type":"ContainerStarted","Data":"eb3eceb6c713703cf2f509c9a1ecb5be324b76c42402dc90cd6f8aaaf2e69c4d"} Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.258802 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.258814 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55d7966795-cnsx8" 
event={"ID":"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66","Type":"ContainerStarted","Data":"8a021443c4c1a11e6a0d7c8fb63dbb3f3362f0318db827eb7d54796300ac257b"} Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.290358 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" podStartSLOduration=4.290341209 podStartE2EDuration="4.290341209s" podCreationTimestamp="2026-03-12 15:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:57.272515639 +0000 UTC m=+1229.557740927" watchObservedRunningTime="2026-03-12 15:07:57.290341209 +0000 UTC m=+1229.575566487" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.314233 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55d7966795-cnsx8" podStartSLOduration=2.314211802 podStartE2EDuration="2.314211802s" podCreationTimestamp="2026-03-12 15:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:57.30715371 +0000 UTC m=+1229.592378988" watchObservedRunningTime="2026-03-12 15:07:57.314211802 +0000 UTC m=+1229.599437080" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.345804 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:57 crc kubenswrapper[4869]: I0312 15:07:57.878250 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-666b564cfb-pk78f"] Mar 12 15:07:57 crc kubenswrapper[4869]: W0312 15:07:57.891797 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod340fff5b_1313_4cbd_82fc_fee5941c1aef.slice/crio-a3b98fcc48b220040e732cb5d31b573aa2ddaf500c1fcc624754b7d414654846 WatchSource:0}: Error finding container a3b98fcc48b220040e732cb5d31b573aa2ddaf500c1fcc624754b7d414654846: Status 404 returned error can't find the container with id a3b98fcc48b220040e732cb5d31b573aa2ddaf500c1fcc624754b7d414654846 Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.266757 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-666b564cfb-pk78f" event={"ID":"340fff5b-1313-4cbd-82fc-fee5941c1aef","Type":"ContainerStarted","Data":"a3b98fcc48b220040e732cb5d31b573aa2ddaf500c1fcc624754b7d414654846"} Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.272159 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f7dc9cccb-h2224" event={"ID":"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55","Type":"ContainerStarted","Data":"3b8f0131e0017262115ec0dc23c7b2937c9c0540e2564adb238f25d2f1afd88c"} Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.272388 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.302166 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f7dc9cccb-h2224" podStartSLOduration=5.302147101 podStartE2EDuration="5.302147101s" podCreationTimestamp="2026-03-12 15:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 15:07:58.292496935 +0000 UTC m=+1230.577722213" watchObservedRunningTime="2026-03-12 15:07:58.302147101 +0000 UTC m=+1230.587372379" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.759674 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.830879 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-fernet-keys\") pod \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.831019 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-scripts\") pod \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.831041 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-credential-keys\") pod \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.831072 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4zfk\" (UniqueName: \"kubernetes.io/projected/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-kube-api-access-h4zfk\") pod \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.831121 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-config-data\") pod \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.831159 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-combined-ca-bundle\") pod \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\" (UID: \"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718\") " Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.838725 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-kube-api-access-h4zfk" (OuterVolumeSpecName: "kube-api-access-h4zfk") pod "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" (UID: "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718"). InnerVolumeSpecName "kube-api-access-h4zfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.840688 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" (UID: "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.840754 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" (UID: "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.840991 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-scripts" (OuterVolumeSpecName: "scripts") pod "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" (UID: "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.870633 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" (UID: "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.873188 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-config-data" (OuterVolumeSpecName: "config-data") pod "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" (UID: "011be9f6-aaa2-48f8-8ef0-aa4c21e2d718"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.933403 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.933433 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4zfk\" (UniqueName: \"kubernetes.io/projected/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-kube-api-access-h4zfk\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.933444 4869 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.933452 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.933462 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:58 crc kubenswrapper[4869]: I0312 15:07:58.933473 4869 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.279447 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jpbgs" event={"ID":"011be9f6-aaa2-48f8-8ef0-aa4c21e2d718","Type":"ContainerDied","Data":"31f1258ddb97f63c00b4197490116e1d79d34fee5bcbf84faf549183f728e5d7"} Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 
15:07:59.279484 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jpbgs" Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.279500 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f1258ddb97f63c00b4197490116e1d79d34fee5bcbf84faf549183f728e5d7" Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.283631 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-666b564cfb-pk78f" event={"ID":"340fff5b-1313-4cbd-82fc-fee5941c1aef","Type":"ContainerStarted","Data":"c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271"} Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.283665 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-666b564cfb-pk78f" event={"ID":"340fff5b-1313-4cbd-82fc-fee5941c1aef","Type":"ContainerStarted","Data":"cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006"} Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.283709 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.283752 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.331706 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-666b564cfb-pk78f" podStartSLOduration=3.331686179 podStartE2EDuration="3.331686179s" podCreationTimestamp="2026-03-12 15:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:07:59.311821961 +0000 UTC m=+1231.597047239" watchObservedRunningTime="2026-03-12 15:07:59.331686179 +0000 UTC m=+1231.616911457" Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.472599 4869 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/keystone-ccddd4fb7-22sz6"]
Mar 12 15:07:59 crc kubenswrapper[4869]: E0312 15:07:59.473031 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" containerName="keystone-bootstrap"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.473049 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" containerName="keystone-bootstrap"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.473224 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" containerName="keystone-bootstrap"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.473898 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.481986 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ccddd4fb7-22sz6"]
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.482509 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.484117 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.484267 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cl6hz"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.484344 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.484525 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.485263 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.552818 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-combined-ca-bundle\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.552943 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-public-tls-certs\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.552980 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-internal-tls-certs\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.553125 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-scripts\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.553231 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-fernet-keys\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.553264 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-credential-keys\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.553309 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rl9g\" (UniqueName: \"kubernetes.io/projected/791764fa-f933-49c0-82ee-0a716cce2f01-kube-api-access-8rl9g\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.553511 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-config-data\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.655697 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rl9g\" (UniqueName: \"kubernetes.io/projected/791764fa-f933-49c0-82ee-0a716cce2f01-kube-api-access-8rl9g\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.655772 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-config-data\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.655813 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-combined-ca-bundle\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.655855 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-public-tls-certs\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.655882 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-internal-tls-certs\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.655948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-scripts\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.656027 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-fernet-keys\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.656050 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-credential-keys\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.665471 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-fernet-keys\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.667751 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-credential-keys\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.668076 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-internal-tls-certs\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.668355 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-scripts\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.684432 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-config-data\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.688328 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-combined-ca-bundle\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.689005 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rl9g\" (UniqueName: \"kubernetes.io/projected/791764fa-f933-49c0-82ee-0a716cce2f01-kube-api-access-8rl9g\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.689345 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/791764fa-f933-49c0-82ee-0a716cce2f01-public-tls-certs\") pod \"keystone-ccddd4fb7-22sz6\" (UID: \"791764fa-f933-49c0-82ee-0a716cce2f01\") " pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.709771 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.709828 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.730429 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.731060 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.757708 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.759878 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.808021 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.818098 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:07:59 crc kubenswrapper[4869]: I0312 15:07:59.865729 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.138775 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555468-sjc6s"]
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.140235 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-sjc6s"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.143638 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.143761 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.143655 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.168319 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-sjc6s"]
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.265655 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzcb2\" (UniqueName: \"kubernetes.io/projected/762c4ca7-73e4-4585-be50-f2308260ce36-kube-api-access-gzcb2\") pod \"auto-csr-approver-29555468-sjc6s\" (UID: \"762c4ca7-73e4-4585-be50-f2308260ce36\") " pod="openshift-infra/auto-csr-approver-29555468-sjc6s"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.297335 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.297367 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.297386 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.297397 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.367769 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzcb2\" (UniqueName: \"kubernetes.io/projected/762c4ca7-73e4-4585-be50-f2308260ce36-kube-api-access-gzcb2\") pod \"auto-csr-approver-29555468-sjc6s\" (UID: \"762c4ca7-73e4-4585-be50-f2308260ce36\") " pod="openshift-infra/auto-csr-approver-29555468-sjc6s"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.404945 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzcb2\" (UniqueName: \"kubernetes.io/projected/762c4ca7-73e4-4585-be50-f2308260ce36-kube-api-access-gzcb2\") pod \"auto-csr-approver-29555468-sjc6s\" (UID: \"762c4ca7-73e4-4585-be50-f2308260ce36\") " pod="openshift-infra/auto-csr-approver-29555468-sjc6s"
Mar 12 15:08:00 crc kubenswrapper[4869]: I0312 15:08:00.464709 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-sjc6s"
Mar 12 15:08:02 crc kubenswrapper[4869]: I0312 15:08:02.318276 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 15:08:02 crc kubenswrapper[4869]: I0312 15:08:02.318791 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 15:08:02 crc kubenswrapper[4869]: I0312 15:08:02.432078 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 12 15:08:02 crc kubenswrapper[4869]: I0312 15:08:02.545919 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 12 15:08:02 crc kubenswrapper[4869]: I0312 15:08:02.546021 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 15:08:02 crc kubenswrapper[4869]: I0312 15:08:02.789153 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 12 15:08:02 crc kubenswrapper[4869]: I0312 15:08:02.849639 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 12 15:08:03 crc kubenswrapper[4869]: I0312 15:08:03.607688 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-78k6r"
Mar 12 15:08:03 crc kubenswrapper[4869]: I0312 15:08:03.701198 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2pbrx"]
Mar 12 15:08:03 crc kubenswrapper[4869]: I0312 15:08:03.701441 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="dnsmasq-dns" containerID="cri-o://c707e908255bc7787c8e32007a043fb2f818792e04634519ab5b21f0e75e694d" gracePeriod=10
Mar 12 15:08:04 crc kubenswrapper[4869]: I0312 15:08:04.355837 4869 generic.go:334] "Generic (PLEG): container finished" podID="e724f55d-2897-4b64-8a80-cdee522a7143" containerID="c707e908255bc7787c8e32007a043fb2f818792e04634519ab5b21f0e75e694d" exitCode=0
Mar 12 15:08:04 crc kubenswrapper[4869]: I0312 15:08:04.355918 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" event={"ID":"e724f55d-2897-4b64-8a80-cdee522a7143","Type":"ContainerDied","Data":"c707e908255bc7787c8e32007a043fb2f818792e04634519ab5b21f0e75e694d"}
Mar 12 15:08:05 crc kubenswrapper[4869]: I0312 15:08:05.767852 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7f798b7b68-jtm8h" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Mar 12 15:08:05 crc kubenswrapper[4869]: I0312 15:08:05.833394 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-776cdff46d-hvjw9" podUID="f1a34267-2bcd-4a01-b2b6-7528c474a7a2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.155:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.155:8443: connect: connection refused"
Mar 12 15:08:07 crc kubenswrapper[4869]: I0312 15:08:07.551882 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused"
Mar 12 15:08:11 crc kubenswrapper[4869]: E0312 15:08:11.288211 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest"
Mar 12 15:08:11 crc kubenswrapper[4869]: E0312 15:08:11.288899 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mqqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b444668b-c6e2-42fc-93e2-8b14ef77eef3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 12 15:08:12 crc kubenswrapper[4869]: E0312 15:08:12.377608 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Mar 12 15:08:12 crc kubenswrapper[4869]: E0312 15:08:12.378068 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s57vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-t7fgp_openstack(ca1953bb-fc5d-4285-8de4-b67746201d05): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 12 15:08:12 crc kubenswrapper[4869]: E0312 15:08:12.379270 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-t7fgp" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05"
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.468826 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" event={"ID":"e724f55d-2897-4b64-8a80-cdee522a7143","Type":"ContainerDied","Data":"e49f1770b143a9135a560b637f31aaac5c06affa416c85df779f0c67bc032070"}
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.469330 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e49f1770b143a9135a560b637f31aaac5c06affa416c85df779f0c67bc032070"
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.660512 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx"
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.742576 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-config\") pod \"e724f55d-2897-4b64-8a80-cdee522a7143\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") "
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.742971 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6xfc\" (UniqueName: \"kubernetes.io/projected/e724f55d-2897-4b64-8a80-cdee522a7143-kube-api-access-k6xfc\") pod \"e724f55d-2897-4b64-8a80-cdee522a7143\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") "
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.743845 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-nb\") pod \"e724f55d-2897-4b64-8a80-cdee522a7143\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") "
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.743877 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-sb\") pod \"e724f55d-2897-4b64-8a80-cdee522a7143\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") "
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.743893 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-svc\") pod \"e724f55d-2897-4b64-8a80-cdee522a7143\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") "
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.743934 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0\") pod \"e724f55d-2897-4b64-8a80-cdee522a7143\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") "
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.748524 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e724f55d-2897-4b64-8a80-cdee522a7143-kube-api-access-k6xfc" (OuterVolumeSpecName: "kube-api-access-k6xfc") pod "e724f55d-2897-4b64-8a80-cdee522a7143" (UID: "e724f55d-2897-4b64-8a80-cdee522a7143"). InnerVolumeSpecName "kube-api-access-k6xfc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.798643 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e724f55d-2897-4b64-8a80-cdee522a7143" (UID: "e724f55d-2897-4b64-8a80-cdee522a7143"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.819716 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e724f55d-2897-4b64-8a80-cdee522a7143" (UID: "e724f55d-2897-4b64-8a80-cdee522a7143"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.819735 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-config" (OuterVolumeSpecName: "config") pod "e724f55d-2897-4b64-8a80-cdee522a7143" (UID: "e724f55d-2897-4b64-8a80-cdee522a7143"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:08:12 crc kubenswrapper[4869]: E0312 15:08:12.843953 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0 podName:e724f55d-2897-4b64-8a80-cdee522a7143 nodeName:}" failed. No retries permitted until 2026-03-12 15:08:13.343901459 +0000 UTC m=+1245.629126727 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0") pod "e724f55d-2897-4b64-8a80-cdee522a7143" (UID: "e724f55d-2897-4b64-8a80-cdee522a7143") : error deleting /var/lib/kubelet/pods/e724f55d-2897-4b64-8a80-cdee522a7143/volume-subpaths: remove /var/lib/kubelet/pods/e724f55d-2897-4b64-8a80-cdee522a7143/volume-subpaths: no such file or directory
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.844085 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e724f55d-2897-4b64-8a80-cdee522a7143" (UID: "e724f55d-2897-4b64-8a80-cdee522a7143"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.850764 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.850794 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.850804 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-config\") on node \"crc\" DevicePath \"\""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.850812 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6xfc\" (UniqueName: \"kubernetes.io/projected/e724f55d-2897-4b64-8a80-cdee522a7143-kube-api-access-k6xfc\") on node \"crc\" DevicePath \"\""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.850823 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.864964 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ccddd4fb7-22sz6"]
Mar 12 15:08:12 crc kubenswrapper[4869]: I0312 15:08:12.924525 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-sjc6s"]
Mar 12 15:08:12 crc kubenswrapper[4869]: W0312 15:08:12.927272 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod762c4ca7_73e4_4585_be50_f2308260ce36.slice/crio-8cce8eff4843bf026d5941c7565127142a6148a79bf43d8ce75aa8a33607a8cb WatchSource:0}: Error finding container 8cce8eff4843bf026d5941c7565127142a6148a79bf43d8ce75aa8a33607a8cb: Status 404 returned error can't find the container with id 8cce8eff4843bf026d5941c7565127142a6148a79bf43d8ce75aa8a33607a8cb
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.360335 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0\") pod \"e724f55d-2897-4b64-8a80-cdee522a7143\" (UID: \"e724f55d-2897-4b64-8a80-cdee522a7143\") "
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.360740 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e724f55d-2897-4b64-8a80-cdee522a7143" (UID: "e724f55d-2897-4b64-8a80-cdee522a7143"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.362061 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e724f55d-2897-4b64-8a80-cdee522a7143-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.480712 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555468-sjc6s" event={"ID":"762c4ca7-73e4-4585-be50-f2308260ce36","Type":"ContainerStarted","Data":"8cce8eff4843bf026d5941c7565127142a6148a79bf43d8ce75aa8a33607a8cb"}
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.482833 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ccddd4fb7-22sz6" event={"ID":"791764fa-f933-49c0-82ee-0a716cce2f01","Type":"ContainerStarted","Data":"2955f0f000d1849767d84ee457059afb84af8cdcacf899a63d16aadd35090a12"}
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.482870 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ccddd4fb7-22sz6" event={"ID":"791764fa-f933-49c0-82ee-0a716cce2f01","Type":"ContainerStarted","Data":"6921986a1cda2e7e7d0695bede8701fd133dcd384650c05fe95c63cb7c9d5eb7"}
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.482905 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-ccddd4fb7-22sz6"
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.485730 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-9p9gw" event={"ID":"e8721cab-3eb8-4c80-a0c8-79c7e007b614","Type":"ContainerStarted","Data":"08b04517a54d502fe370e31cd01123f53f5eb592994f193a2f2359fd30ade15f"}
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.487840 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx"
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.489673 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4rkhc" event={"ID":"bb0e86fd-502f-4fef-9f29-4d612a8d111f","Type":"ContainerStarted","Data":"089f1642db2255e3eebe62fe11a94058c554f002ae443794d01353199c0fce28"}
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.525284 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ccddd4fb7-22sz6" podStartSLOduration=14.525266619 podStartE2EDuration="14.525266619s" podCreationTimestamp="2026-03-12 15:07:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:13.510455116 +0000 UTC m=+1245.795680414" watchObservedRunningTime="2026-03-12 15:08:13.525266619 +0000 UTC m=+1245.810491897"
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.532731 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2pbrx"]
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.552040 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2pbrx"]
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.553365 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-9p9gw" podStartSLOduration=3.712556403 podStartE2EDuration="47.553341832s" podCreationTimestamp="2026-03-12 15:07:26 +0000 UTC" firstStartedPulling="2026-03-12 15:07:28.597940962 +0000 UTC m=+1200.883166240" lastFinishedPulling="2026-03-12 15:08:12.438726401 +0000 UTC m=+1244.723951669" observedRunningTime="2026-03-12 15:08:13.537759327 +0000 UTC m=+1245.822984605" watchObservedRunningTime="2026-03-12 15:08:13.553341832 +0000 UTC m=+1245.838567110"
Mar 12 15:08:13 crc kubenswrapper[4869]: I0312 15:08:13.583728 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-4rkhc" podStartSLOduration=2.977005749 podStartE2EDuration="46.58368601s" podCreationTimestamp="2026-03-12 15:07:27 +0000 UTC" firstStartedPulling="2026-03-12 15:07:28.824098652 +0000 UTC m=+1201.109323920" lastFinishedPulling="2026-03-12 15:08:12.430778903 +0000 UTC m=+1244.716004181" observedRunningTime="2026-03-12 15:08:13.552143948 +0000 UTC m=+1245.837369236" watchObservedRunningTime="2026-03-12 15:08:13.58368601 +0000 UTC m=+1245.868911288"
Mar 12 15:08:14 crc kubenswrapper[4869]: I0312 15:08:14.357750 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" path="/var/lib/kubelet/pods/e724f55d-2897-4b64-8a80-cdee522a7143/volumes"
Mar 12 15:08:14 crc kubenswrapper[4869]: I0312 15:08:14.497670 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555468-sjc6s" event={"ID":"762c4ca7-73e4-4585-be50-f2308260ce36","Type":"ContainerStarted","Data":"8ee38970a43dd928d743c1150be98a5af45f8d202531e2c2a85553d9c175dd60"}
Mar 12 15:08:14 crc kubenswrapper[4869]: I0312 15:08:14.518986 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555468-sjc6s" podStartSLOduration=13.430036975 podStartE2EDuration="14.518969983s" podCreationTimestamp="2026-03-12 15:08:00 +0000 UTC" firstStartedPulling="2026-03-12 15:08:12.930755544 +0000 UTC m=+1245.215980822" lastFinishedPulling="2026-03-12 15:08:14.019688552 +0000 UTC m=+1246.304913830" observedRunningTime="2026-03-12 15:08:14.513440735 +0000 UTC m=+1246.798666023" watchObservedRunningTime="2026-03-12 15:08:14.518969983 +0000 UTC m=+1246.804195261"
Mar 12 15:08:15 crc kubenswrapper[4869]: I0312 15:08:15.510123 4869 generic.go:334] "Generic (PLEG): container finished" podID="762c4ca7-73e4-4585-be50-f2308260ce36" containerID="8ee38970a43dd928d743c1150be98a5af45f8d202531e2c2a85553d9c175dd60" exitCode=0
Mar 12 15:08:15 
crc kubenswrapper[4869]: I0312 15:08:15.510194 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555468-sjc6s" event={"ID":"762c4ca7-73e4-4585-be50-f2308260ce36","Type":"ContainerDied","Data":"8ee38970a43dd928d743c1150be98a5af45f8d202531e2c2a85553d9c175dd60"} Mar 12 15:08:15 crc kubenswrapper[4869]: I0312 15:08:15.512736 4869 generic.go:334] "Generic (PLEG): container finished" podID="bb0e86fd-502f-4fef-9f29-4d612a8d111f" containerID="089f1642db2255e3eebe62fe11a94058c554f002ae443794d01353199c0fce28" exitCode=0 Mar 12 15:08:15 crc kubenswrapper[4869]: I0312 15:08:15.512785 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4rkhc" event={"ID":"bb0e86fd-502f-4fef-9f29-4d612a8d111f","Type":"ContainerDied","Data":"089f1642db2255e3eebe62fe11a94058c554f002ae443794d01353199c0fce28"} Mar 12 15:08:17 crc kubenswrapper[4869]: I0312 15:08:17.552864 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-2pbrx" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: i/o timeout" Mar 12 15:08:17 crc kubenswrapper[4869]: I0312 15:08:17.659901 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:08:17 crc kubenswrapper[4869]: I0312 15:08:17.738739 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:08:19 crc kubenswrapper[4869]: I0312 15:08:19.378006 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:08:19 crc kubenswrapper[4869]: I0312 15:08:19.413241 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-776cdff46d-hvjw9" Mar 12 15:08:19 crc kubenswrapper[4869]: I0312 15:08:19.474538 4869 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/horizon-7f798b7b68-jtm8h"] Mar 12 15:08:19 crc kubenswrapper[4869]: I0312 15:08:19.542796 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f798b7b68-jtm8h" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon-log" containerID="cri-o://a26eec3d4fd18f4819fca706f27ddd2aa65eae9751ad98f439f13147a5de8480" gracePeriod=30 Mar 12 15:08:19 crc kubenswrapper[4869]: I0312 15:08:19.543249 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7f798b7b68-jtm8h" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" containerID="cri-o://8d06094a459cda2058ff32c4ae52ed8cd8b1f55cc120d4cd9b716f47aa66a7f1" gracePeriod=30 Mar 12 15:08:19 crc kubenswrapper[4869]: I0312 15:08:19.684425 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:08:19 crc kubenswrapper[4869]: I0312 15:08:19.684488 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.540615 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-sjc6s" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.550993 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.564996 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4rkhc" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.565354 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4rkhc" event={"ID":"bb0e86fd-502f-4fef-9f29-4d612a8d111f","Type":"ContainerDied","Data":"e80c3e3b9e25726dfdf4341591f776898e52d4221fa26e6ca804babe3f5f6ffd"} Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.565383 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e80c3e3b9e25726dfdf4341591f776898e52d4221fa26e6ca804babe3f5f6ffd" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.578934 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555468-sjc6s" event={"ID":"762c4ca7-73e4-4585-be50-f2308260ce36","Type":"ContainerDied","Data":"8cce8eff4843bf026d5941c7565127142a6148a79bf43d8ce75aa8a33607a8cb"} Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.578972 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cce8eff4843bf026d5941c7565127142a6148a79bf43d8ce75aa8a33607a8cb" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.579024 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-sjc6s" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.640508 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-combined-ca-bundle\") pod \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.640765 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-db-sync-config-data\") pod \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.640817 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzcb2\" (UniqueName: \"kubernetes.io/projected/762c4ca7-73e4-4585-be50-f2308260ce36-kube-api-access-gzcb2\") pod \"762c4ca7-73e4-4585-be50-f2308260ce36\" (UID: \"762c4ca7-73e4-4585-be50-f2308260ce36\") " Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.640904 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2rtz\" (UniqueName: \"kubernetes.io/projected/bb0e86fd-502f-4fef-9f29-4d612a8d111f-kube-api-access-v2rtz\") pod \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\" (UID: \"bb0e86fd-502f-4fef-9f29-4d612a8d111f\") " Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.650889 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762c4ca7-73e4-4585-be50-f2308260ce36-kube-api-access-gzcb2" (OuterVolumeSpecName: "kube-api-access-gzcb2") pod "762c4ca7-73e4-4585-be50-f2308260ce36" (UID: "762c4ca7-73e4-4585-be50-f2308260ce36"). InnerVolumeSpecName "kube-api-access-gzcb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.659954 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bb0e86fd-502f-4fef-9f29-4d612a8d111f" (UID: "bb0e86fd-502f-4fef-9f29-4d612a8d111f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.707986 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0e86fd-502f-4fef-9f29-4d612a8d111f-kube-api-access-v2rtz" (OuterVolumeSpecName: "kube-api-access-v2rtz") pod "bb0e86fd-502f-4fef-9f29-4d612a8d111f" (UID: "bb0e86fd-502f-4fef-9f29-4d612a8d111f"). InnerVolumeSpecName "kube-api-access-v2rtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.741138 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb0e86fd-502f-4fef-9f29-4d612a8d111f" (UID: "bb0e86fd-502f-4fef-9f29-4d612a8d111f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.742517 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2rtz\" (UniqueName: \"kubernetes.io/projected/bb0e86fd-502f-4fef-9f29-4d612a8d111f-kube-api-access-v2rtz\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.743615 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.743717 4869 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e86fd-502f-4fef-9f29-4d612a8d111f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:22 crc kubenswrapper[4869]: I0312 15:08:22.743777 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzcb2\" (UniqueName: \"kubernetes.io/projected/762c4ca7-73e4-4585-be50-f2308260ce36-kube-api-access-gzcb2\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:23 crc kubenswrapper[4869]: E0312 15:08:23.393885 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.590237 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerStarted","Data":"a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b"} Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.591089 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-central-agent" containerID="cri-o://f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc" gracePeriod=30 Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.591470 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.591958 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="proxy-httpd" containerID="cri-o://a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b" gracePeriod=30 Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.592366 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-notification-agent" containerID="cri-o://d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3" gracePeriod=30 Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.599295 4869 generic.go:334] "Generic (PLEG): container finished" podID="b4d63031-e072-466e-ae3c-d829a699b197" containerID="8d06094a459cda2058ff32c4ae52ed8cd8b1f55cc120d4cd9b716f47aa66a7f1" exitCode=0 Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.599354 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f798b7b68-jtm8h" event={"ID":"b4d63031-e072-466e-ae3c-d829a699b197","Type":"ContainerDied","Data":"8d06094a459cda2058ff32c4ae52ed8cd8b1f55cc120d4cd9b716f47aa66a7f1"} Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.604751 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-wdv62"] Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.620530 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-wdv62"] Mar 12 15:08:23 crc 
kubenswrapper[4869]: I0312 15:08:23.655761 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.842864 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5fd5649b5f-5rjlk"] Mar 12 15:08:23 crc kubenswrapper[4869]: E0312 15:08:23.843521 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762c4ca7-73e4-4585-be50-f2308260ce36" containerName="oc" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.843552 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="762c4ca7-73e4-4585-be50-f2308260ce36" containerName="oc" Mar 12 15:08:23 crc kubenswrapper[4869]: E0312 15:08:23.843567 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="init" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.843573 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="init" Mar 12 15:08:23 crc kubenswrapper[4869]: E0312 15:08:23.843602 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="dnsmasq-dns" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.843608 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="dnsmasq-dns" Mar 12 15:08:23 crc kubenswrapper[4869]: E0312 15:08:23.843621 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0e86fd-502f-4fef-9f29-4d612a8d111f" containerName="barbican-db-sync" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.843627 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0e86fd-502f-4fef-9f29-4d612a8d111f" containerName="barbican-db-sync" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.843813 4869 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bb0e86fd-502f-4fef-9f29-4d612a8d111f" containerName="barbican-db-sync" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.843837 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="762c4ca7-73e4-4585-be50-f2308260ce36" containerName="oc" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.843852 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e724f55d-2897-4b64-8a80-cdee522a7143" containerName="dnsmasq-dns" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.844839 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.850281 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-b57d7b9cd-fglxg"] Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.854638 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.855450 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.855629 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vkmmd" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.857788 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.859987 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41e8dfe2-49a6-48a2-a037-d090aa545010-logs\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860038 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d671489-0a10-4779-8567-c34e80544dbb-logs\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860063 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-config-data\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860120 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-config-data-custom\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860157 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkmlg\" (UniqueName: 
\"kubernetes.io/projected/6d671489-0a10-4779-8567-c34e80544dbb-kube-api-access-fkmlg\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860176 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-config-data\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860194 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-combined-ca-bundle\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860211 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-combined-ca-bundle\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860236 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-config-data-custom\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.860260 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-png6m\" (UniqueName: \"kubernetes.io/projected/41e8dfe2-49a6-48a2-a037-d090aa545010-kube-api-access-png6m\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.872786 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.874559 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fd5649b5f-5rjlk"] Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.933980 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b57d7b9cd-fglxg"] Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.961278 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d671489-0a10-4779-8567-c34e80544dbb-logs\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.961327 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-config-data\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-config-data-custom\") pod 
\"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962276 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkmlg\" (UniqueName: \"kubernetes.io/projected/6d671489-0a10-4779-8567-c34e80544dbb-kube-api-access-fkmlg\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962311 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-config-data\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962330 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-combined-ca-bundle\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962355 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-combined-ca-bundle\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962398 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-config-data-custom\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962433 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-png6m\" (UniqueName: \"kubernetes.io/projected/41e8dfe2-49a6-48a2-a037-d090aa545010-kube-api-access-png6m\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962452 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41e8dfe2-49a6-48a2-a037-d090aa545010-logs\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.962834 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41e8dfe2-49a6-48a2-a037-d090aa545010-logs\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.964417 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gp8rq"] Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.966393 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.969039 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d671489-0a10-4779-8567-c34e80544dbb-logs\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.975563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-config-data-custom\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.981971 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-config-data\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.982512 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-combined-ca-bundle\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.984452 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d671489-0a10-4779-8567-c34e80544dbb-config-data-custom\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " 
pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.984641 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-combined-ca-bundle\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.985385 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41e8dfe2-49a6-48a2-a037-d090aa545010-config-data\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.995430 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-png6m\" (UniqueName: \"kubernetes.io/projected/41e8dfe2-49a6-48a2-a037-d090aa545010-kube-api-access-png6m\") pod \"barbican-keystone-listener-b57d7b9cd-fglxg\" (UID: \"41e8dfe2-49a6-48a2-a037-d090aa545010\") " pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:23 crc kubenswrapper[4869]: I0312 15:08:23.999294 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkmlg\" (UniqueName: \"kubernetes.io/projected/6d671489-0a10-4779-8567-c34e80544dbb-kube-api-access-fkmlg\") pod \"barbican-worker-5fd5649b5f-5rjlk\" (UID: \"6d671489-0a10-4779-8567-c34e80544dbb\") " pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.017588 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gp8rq"] Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.059457 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-55d7966795-cnsx8"] Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.059720 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55d7966795-cnsx8" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-api" containerID="cri-o://8a021443c4c1a11e6a0d7c8fb63dbb3f3362f0318db827eb7d54796300ac257b" gracePeriod=30 Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.060451 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55d7966795-cnsx8" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-httpd" containerID="cri-o://eb3eceb6c713703cf2f509c9a1ecb5be324b76c42402dc90cd6f8aaaf2e69c4d" gracePeriod=30 Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.072697 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75658d85fd-ktwbd"] Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.074633 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.078528 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.093789 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75658d85fd-ktwbd"] Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.105604 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79647c85bf-8lt5x"] Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.107094 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.163621 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79647c85bf-8lt5x"] Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173513 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-combined-ca-bundle\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173603 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173627 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173649 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/898fa39d-459b-4d0b-969a-b8e93ea03fe1-logs\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173665 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data-custom\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173679 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173723 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-config\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173760 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdztz\" (UniqueName: \"kubernetes.io/projected/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-kube-api-access-bdztz\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173780 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173816 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-svc\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.173835 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfrj\" (UniqueName: \"kubernetes.io/projected/898fa39d-459b-4d0b-969a-b8e93ea03fe1-kube-api-access-vbfrj\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.177014 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fd5649b5f-5rjlk" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.207077 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285117 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdztz\" (UniqueName: \"kubernetes.io/projected/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-kube-api-access-bdztz\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285163 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-combined-ca-bundle\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285187 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-httpd-config\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285212 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285305 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-svc\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: 
\"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285337 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbfrj\" (UniqueName: \"kubernetes.io/projected/898fa39d-459b-4d0b-969a-b8e93ea03fe1-kube-api-access-vbfrj\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285388 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5kx\" (UniqueName: \"kubernetes.io/projected/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-kube-api-access-qs5kx\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285432 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-config\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285483 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-combined-ca-bundle\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285516 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-internal-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" 
(UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285574 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-ovndb-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285730 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285767 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285808 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data-custom\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285822 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/898fa39d-459b-4d0b-969a-b8e93ea03fe1-logs\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " 
pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285840 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285922 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-public-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.285963 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-config\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.287079 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.287376 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-config\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 
15:08:24.287783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/898fa39d-459b-4d0b-969a-b8e93ea03fe1-logs\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.288125 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.290185 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-svc\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.290299 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.291511 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-combined-ca-bundle\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.294559 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.295477 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data-custom\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.308846 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdztz\" (UniqueName: \"kubernetes.io/projected/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-kube-api-access-bdztz\") pod \"dnsmasq-dns-85ff748b95-gp8rq\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.311098 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbfrj\" (UniqueName: \"kubernetes.io/projected/898fa39d-459b-4d0b-969a-b8e93ea03fe1-kube-api-access-vbfrj\") pod \"barbican-api-75658d85fd-ktwbd\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.365228 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd70d1b1-5059-4e5f-9217-eeb022665766" path="/var/lib/kubelet/pods/dd70d1b1-5059-4e5f-9217-eeb022665766/volumes" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.388531 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-httpd-config\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " 
pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.392768 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5kx\" (UniqueName: \"kubernetes.io/projected/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-kube-api-access-qs5kx\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.392821 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-config\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.392928 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-internal-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.392981 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-ovndb-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.393126 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-public-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 
15:08:24.393192 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-combined-ca-bundle\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.394838 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.394853 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-httpd-config\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.402029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-public-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.402252 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-config\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.404357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-ovndb-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 
15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.404529 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-internal-tls-certs\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.404700 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-combined-ca-bundle\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.417252 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5kx\" (UniqueName: \"kubernetes.io/projected/b4e6ce48-0823-4cd4-8b05-252a9ecbe205-kube-api-access-qs5kx\") pod \"neutron-79647c85bf-8lt5x\" (UID: \"b4e6ce48-0823-4cd4-8b05-252a9ecbe205\") " pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.425232 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.438185 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.445424 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-55d7966795-cnsx8" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": read tcp 10.217.0.2:51662->10.217.0.161:9696: read: connection reset by peer" Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.618663 4869 generic.go:334] "Generic (PLEG): container finished" podID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerID="eb3eceb6c713703cf2f509c9a1ecb5be324b76c42402dc90cd6f8aaaf2e69c4d" exitCode=0 Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.619146 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55d7966795-cnsx8" event={"ID":"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66","Type":"ContainerDied","Data":"eb3eceb6c713703cf2f509c9a1ecb5be324b76c42402dc90cd6f8aaaf2e69c4d"} Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.623714 4869 generic.go:334] "Generic (PLEG): container finished" podID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerID="a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b" exitCode=0 Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.623765 4869 generic.go:334] "Generic (PLEG): container finished" podID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerID="f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc" exitCode=0 Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.623765 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerDied","Data":"a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b"} Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.623822 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerDied","Data":"f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc"} Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.829999 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fd5649b5f-5rjlk"] Mar 12 15:08:24 crc kubenswrapper[4869]: I0312 15:08:24.951031 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b57d7b9cd-fglxg"] Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.099955 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gp8rq"] Mar 12 15:08:25 crc kubenswrapper[4869]: W0312 15:08:25.104605 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75f308bb_16c9_4cf1_a9fb_1b5e4c1bbc07.slice/crio-cebe402b114474f0290c30836f0cfa5d0b1971908410c9ed2cbe05d206cbfe8a WatchSource:0}: Error finding container cebe402b114474f0290c30836f0cfa5d0b1971908410c9ed2cbe05d206cbfe8a: Status 404 returned error can't find the container with id cebe402b114474f0290c30836f0cfa5d0b1971908410c9ed2cbe05d206cbfe8a Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.183912 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75658d85fd-ktwbd"] Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.258149 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79647c85bf-8lt5x"] Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.319273 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:25 crc kubenswrapper[4869]: E0312 15:08:25.343269 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-t7fgp" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.431656 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-log-httpd\") pod \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.432175 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-config-data\") pod \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.432718 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mqqv\" (UniqueName: \"kubernetes.io/projected/b444668b-c6e2-42fc-93e2-8b14ef77eef3-kube-api-access-2mqqv\") pod \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.432706 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b444668b-c6e2-42fc-93e2-8b14ef77eef3" (UID: "b444668b-c6e2-42fc-93e2-8b14ef77eef3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.432769 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-run-httpd\") pod \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.432831 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-sg-core-conf-yaml\") pod \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.433175 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b444668b-c6e2-42fc-93e2-8b14ef77eef3" (UID: "b444668b-c6e2-42fc-93e2-8b14ef77eef3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.433400 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-combined-ca-bundle\") pod \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.433562 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-scripts\") pod \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\" (UID: \"b444668b-c6e2-42fc-93e2-8b14ef77eef3\") " Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.438856 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-scripts" (OuterVolumeSpecName: "scripts") pod "b444668b-c6e2-42fc-93e2-8b14ef77eef3" (UID: "b444668b-c6e2-42fc-93e2-8b14ef77eef3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.441234 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b444668b-c6e2-42fc-93e2-8b14ef77eef3" (UID: "b444668b-c6e2-42fc-93e2-8b14ef77eef3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.441628 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.441866 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.441937 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b444668b-c6e2-42fc-93e2-8b14ef77eef3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.443103 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b444668b-c6e2-42fc-93e2-8b14ef77eef3-kube-api-access-2mqqv" (OuterVolumeSpecName: "kube-api-access-2mqqv") pod "b444668b-c6e2-42fc-93e2-8b14ef77eef3" (UID: "b444668b-c6e2-42fc-93e2-8b14ef77eef3"). InnerVolumeSpecName "kube-api-access-2mqqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.544119 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mqqv\" (UniqueName: \"kubernetes.io/projected/b444668b-c6e2-42fc-93e2-8b14ef77eef3-kube-api-access-2mqqv\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.544173 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.582705 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b444668b-c6e2-42fc-93e2-8b14ef77eef3" (UID: "b444668b-c6e2-42fc-93e2-8b14ef77eef3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.606872 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-config-data" (OuterVolumeSpecName: "config-data") pod "b444668b-c6e2-42fc-93e2-8b14ef77eef3" (UID: "b444668b-c6e2-42fc-93e2-8b14ef77eef3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.645910 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.645959 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b444668b-c6e2-42fc-93e2-8b14ef77eef3-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.656413 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-55d7966795-cnsx8" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.161:9696/\": dial tcp 10.217.0.161:9696: connect: connection refused" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.671355 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79647c85bf-8lt5x" event={"ID":"b4e6ce48-0823-4cd4-8b05-252a9ecbe205","Type":"ContainerStarted","Data":"3c1e9130aaa4776232fa8ab283f5dfc606a553e254e2abe93261b6a93d093482"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.681618 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75658d85fd-ktwbd" event={"ID":"898fa39d-459b-4d0b-969a-b8e93ea03fe1","Type":"ContainerStarted","Data":"4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.681658 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75658d85fd-ktwbd" event={"ID":"898fa39d-459b-4d0b-969a-b8e93ea03fe1","Type":"ContainerStarted","Data":"c559838923928761160e138484aa661c04e2927ea8c18f5a76dfa8f5e0350a76"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.685444 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" event={"ID":"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07","Type":"ContainerStarted","Data":"8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.685713 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" event={"ID":"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07","Type":"ContainerStarted","Data":"cebe402b114474f0290c30836f0cfa5d0b1971908410c9ed2cbe05d206cbfe8a"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.688641 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" event={"ID":"41e8dfe2-49a6-48a2-a037-d090aa545010","Type":"ContainerStarted","Data":"e7c6d3c87e2d16800ccd808d7dcf498fc824ba7b47b7d2d06a3f1f7744126165"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.690418 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd5649b5f-5rjlk" event={"ID":"6d671489-0a10-4779-8567-c34e80544dbb","Type":"ContainerStarted","Data":"4a12448cfa113d57f55d4ebe04a961437830977ec4ea75ee3c932cbf69e1ae28"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.692759 4869 generic.go:334] "Generic (PLEG): container finished" podID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerID="d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3" exitCode=0 Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.692792 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerDied","Data":"d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.692811 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b444668b-c6e2-42fc-93e2-8b14ef77eef3","Type":"ContainerDied","Data":"788a5b2fafa6aa775313d1284d6a52f23ee3f8dbac0ea70c73d3d1f9827438e5"} Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.692828 4869 scope.go:117] "RemoveContainer" containerID="a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.692824 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.773115 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f798b7b68-jtm8h" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.897828 4869 scope.go:117] "RemoveContainer" containerID="d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.925831 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.940843 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.973400 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:25 crc kubenswrapper[4869]: E0312 15:08:25.973864 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="proxy-httpd" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.973884 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="proxy-httpd" Mar 12 15:08:25 crc kubenswrapper[4869]: E0312 15:08:25.973917 4869 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-central-agent" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.973925 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-central-agent" Mar 12 15:08:25 crc kubenswrapper[4869]: E0312 15:08:25.973938 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-notification-agent" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.973946 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-notification-agent" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.974154 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-central-agent" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.974191 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="proxy-httpd" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.974216 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" containerName="ceilometer-notification-agent" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.974837 4869 scope.go:117] "RemoveContainer" containerID="f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.980900 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.986892 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.987164 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 15:08:25 crc kubenswrapper[4869]: I0312 15:08:25.996968 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.053309 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gbg5\" (UniqueName: \"kubernetes.io/projected/c67d578b-b8ef-43a0-a170-2f4f1ca48195-kube-api-access-2gbg5\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.053641 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-config-data\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.053791 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-run-httpd\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.053915 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.053941 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-log-httpd\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.053973 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.054108 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-scripts\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.059709 4869 scope.go:117] "RemoveContainer" containerID="a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b" Mar 12 15:08:26 crc kubenswrapper[4869]: E0312 15:08:26.060119 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b\": container with ID starting with a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b not found: ID does not exist" containerID="a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.060150 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b"} err="failed to get container status \"a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b\": rpc error: code = NotFound desc = could not find container \"a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b\": container with ID starting with a44fa9d068dd3a2cb5ec245dc60d6c66591654071eae21033192d8c47b0db88b not found: ID does not exist" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.060168 4869 scope.go:117] "RemoveContainer" containerID="d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3" Mar 12 15:08:26 crc kubenswrapper[4869]: E0312 15:08:26.060417 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3\": container with ID starting with d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3 not found: ID does not exist" containerID="d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.060438 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3"} err="failed to get container status \"d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3\": rpc error: code = NotFound desc = could not find container \"d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3\": container with ID starting with d14023009c101fa53090e88a111afcc29bfeb3173093256ce2dac885f10fddf3 not found: ID does not exist" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.060452 4869 scope.go:117] "RemoveContainer" containerID="f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc" Mar 12 15:08:26 crc kubenswrapper[4869]: E0312 15:08:26.061078 4869 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc\": container with ID starting with f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc not found: ID does not exist" containerID="f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.061100 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc"} err="failed to get container status \"f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc\": rpc error: code = NotFound desc = could not find container \"f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc\": container with ID starting with f1039753306fc2ec2026e325eee31c594a3f58104c33a5cc050b97797fd018cc not found: ID does not exist" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.155555 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.155615 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-log-httpd\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.155648 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " 
pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.155729 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-scripts\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.155828 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gbg5\" (UniqueName: \"kubernetes.io/projected/c67d578b-b8ef-43a0-a170-2f4f1ca48195-kube-api-access-2gbg5\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.155866 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-config-data\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.155897 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-run-httpd\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.156377 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-run-httpd\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.157217 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-log-httpd\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.161800 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.162202 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-scripts\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.173265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.176649 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-config-data\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.181426 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gbg5\" (UniqueName: \"kubernetes.io/projected/c67d578b-b8ef-43a0-a170-2f4f1ca48195-kube-api-access-2gbg5\") pod \"ceilometer-0\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.355077 4869 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.368637 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b444668b-c6e2-42fc-93e2-8b14ef77eef3" path="/var/lib/kubelet/pods/b444668b-c6e2-42fc-93e2-8b14ef77eef3/volumes" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.702582 4869 generic.go:334] "Generic (PLEG): container finished" podID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" containerID="8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b" exitCode=0 Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.702655 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" event={"ID":"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07","Type":"ContainerDied","Data":"8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b"} Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.715384 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79647c85bf-8lt5x" event={"ID":"b4e6ce48-0823-4cd4-8b05-252a9ecbe205","Type":"ContainerStarted","Data":"a8d02d8ba152a57abdd5a1678703039ed81873f80903da94d7e7a992050573ff"} Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.715426 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79647c85bf-8lt5x" event={"ID":"b4e6ce48-0823-4cd4-8b05-252a9ecbe205","Type":"ContainerStarted","Data":"d9defcf28d20f8a4093879bb4a09e980b0651721cba6b7e5851301d430dcfdac"} Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.716164 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.721173 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75658d85fd-ktwbd" 
event={"ID":"898fa39d-459b-4d0b-969a-b8e93ea03fe1","Type":"ContainerStarted","Data":"6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983"} Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.721760 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.721839 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.786413 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79647c85bf-8lt5x" podStartSLOduration=2.784506255 podStartE2EDuration="2.784506255s" podCreationTimestamp="2026-03-12 15:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:26.77980761 +0000 UTC m=+1259.065032888" watchObservedRunningTime="2026-03-12 15:08:26.784506255 +0000 UTC m=+1259.069731533" Mar 12 15:08:26 crc kubenswrapper[4869]: I0312 15:08:26.818094 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75658d85fd-ktwbd" podStartSLOduration=2.818079465 podStartE2EDuration="2.818079465s" podCreationTimestamp="2026-03-12 15:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:26.816533321 +0000 UTC m=+1259.101758599" watchObservedRunningTime="2026-03-12 15:08:26.818079465 +0000 UTC m=+1259.103304743" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.330863 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-566d64c64b-vx76m"] Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.333431 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.335295 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.336724 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.370472 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-566d64c64b-vx76m"] Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.406453 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/573219b1-c8f0-49ab-86ac-d2861f55dfae-logs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.406595 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-public-tls-certs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.406763 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-combined-ca-bundle\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.406927 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-internal-tls-certs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.407016 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-config-data-custom\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.407121 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7d57\" (UniqueName: \"kubernetes.io/projected/573219b1-c8f0-49ab-86ac-d2861f55dfae-kube-api-access-b7d57\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.407176 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-config-data\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.509484 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-config-data\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.509894 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/573219b1-c8f0-49ab-86ac-d2861f55dfae-logs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.509923 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-public-tls-certs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.509970 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-combined-ca-bundle\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.510033 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-internal-tls-certs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.510065 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-config-data-custom\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.510108 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7d57\" (UniqueName: 
\"kubernetes.io/projected/573219b1-c8f0-49ab-86ac-d2861f55dfae-kube-api-access-b7d57\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.510993 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/573219b1-c8f0-49ab-86ac-d2861f55dfae-logs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.530121 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-config-data\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.530531 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-public-tls-certs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.532838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-internal-tls-certs\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.533508 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-config-data-custom\") pod 
\"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.534963 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7d57\" (UniqueName: \"kubernetes.io/projected/573219b1-c8f0-49ab-86ac-d2861f55dfae-kube-api-access-b7d57\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.537146 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/573219b1-c8f0-49ab-86ac-d2861f55dfae-combined-ca-bundle\") pod \"barbican-api-566d64c64b-vx76m\" (UID: \"573219b1-c8f0-49ab-86ac-d2861f55dfae\") " pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.663196 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.699959 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.740762 4869 generic.go:334] "Generic (PLEG): container finished" podID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerID="8a021443c4c1a11e6a0d7c8fb63dbb3f3362f0318db827eb7d54796300ac257b" exitCode=0 Mar 12 15:08:27 crc kubenswrapper[4869]: I0312 15:08:27.740965 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55d7966795-cnsx8" event={"ID":"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66","Type":"ContainerDied","Data":"8a021443c4c1a11e6a0d7c8fb63dbb3f3362f0318db827eb7d54796300ac257b"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.218493 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.330904 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-config\") pod \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.330939 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-combined-ca-bundle\") pod \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.330985 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-internal-tls-certs\") pod \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.331043 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lqw7\" (UniqueName: \"kubernetes.io/projected/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-kube-api-access-8lqw7\") pod \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.331077 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-public-tls-certs\") pod \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.331138 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-ovndb-tls-certs\") pod \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.331160 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-httpd-config\") pod \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\" (UID: \"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66\") " Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.348800 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" (UID: "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.353674 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-kube-api-access-8lqw7" (OuterVolumeSpecName: "kube-api-access-8lqw7") pod "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" (UID: "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66"). InnerVolumeSpecName "kube-api-access-8lqw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.426213 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-566d64c64b-vx76m"] Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.433909 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lqw7\" (UniqueName: \"kubernetes.io/projected/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-kube-api-access-8lqw7\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.433944 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.517599 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" (UID: "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.526047 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-config" (OuterVolumeSpecName: "config") pod "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" (UID: "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.535681 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.535709 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.537315 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" (UID: "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.537893 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" (UID: "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.539920 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" (UID: "bc4e8fd1-48da-4dac-8d0b-bed00a35ad66"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.638273 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.638427 4869 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.638521 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.751391 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" event={"ID":"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07","Type":"ContainerStarted","Data":"8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.751569 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.754235 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" event={"ID":"41e8dfe2-49a6-48a2-a037-d090aa545010","Type":"ContainerStarted","Data":"29a59daff4cd097e79a9766f556c5d3fd1d081646d787d03d7b15e76b122c848"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.754275 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" 
event={"ID":"41e8dfe2-49a6-48a2-a037-d090aa545010","Type":"ContainerStarted","Data":"91a947b9d32c6fc9920d0537348bf74df04087c07a63ead4decf22f52d83601a"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.756003 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd5649b5f-5rjlk" event={"ID":"6d671489-0a10-4779-8567-c34e80544dbb","Type":"ContainerStarted","Data":"c34484be55483ab19f66b6e40aca2265db6867363d2036148815036c633e643e"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.756043 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fd5649b5f-5rjlk" event={"ID":"6d671489-0a10-4779-8567-c34e80544dbb","Type":"ContainerStarted","Data":"12c6ab8d2cf4afbc39e0a5a8e36929cb7e4a42a307d5f7b9371b28c87f3280e0"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.785779 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55d7966795-cnsx8" event={"ID":"bc4e8fd1-48da-4dac-8d0b-bed00a35ad66","Type":"ContainerDied","Data":"e52be672c76eb099580f00a5b20ddd70cf897c2aefb5749b43d653d20d568f8a"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.785824 4869 scope.go:117] "RemoveContainer" containerID="eb3eceb6c713703cf2f509c9a1ecb5be324b76c42402dc90cd6f8aaaf2e69c4d" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.785946 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55d7966795-cnsx8" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.820900 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerStarted","Data":"31666c6e447b5f2d0573a0d1facad1178976f9462fb95a093acb0ea4d43af7cd"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.836360 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" podStartSLOduration=5.836332554 podStartE2EDuration="5.836332554s" podCreationTimestamp="2026-03-12 15:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:28.82185027 +0000 UTC m=+1261.107075538" watchObservedRunningTime="2026-03-12 15:08:28.836332554 +0000 UTC m=+1261.121557832" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.843562 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-566d64c64b-vx76m" event={"ID":"573219b1-c8f0-49ab-86ac-d2861f55dfae","Type":"ContainerStarted","Data":"5e000add996137671e80927a74bced4ae0203b03e7abe154e9f0572f11fc49ae"} Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.895507 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5fd5649b5f-5rjlk" podStartSLOduration=3.49175565 podStartE2EDuration="5.895489546s" podCreationTimestamp="2026-03-12 15:08:23 +0000 UTC" firstStartedPulling="2026-03-12 15:08:24.852393518 +0000 UTC m=+1257.137618796" lastFinishedPulling="2026-03-12 15:08:27.256127414 +0000 UTC m=+1259.541352692" observedRunningTime="2026-03-12 15:08:28.867219488 +0000 UTC m=+1261.152444766" watchObservedRunningTime="2026-03-12 15:08:28.895489546 +0000 UTC m=+1261.180714824" Mar 12 15:08:28 crc kubenswrapper[4869]: I0312 15:08:28.910055 4869 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/barbican-keystone-listener-b57d7b9cd-fglxg" podStartSLOduration=3.544707795 podStartE2EDuration="5.910039982s" podCreationTimestamp="2026-03-12 15:08:23 +0000 UTC" firstStartedPulling="2026-03-12 15:08:24.961103608 +0000 UTC m=+1257.246328886" lastFinishedPulling="2026-03-12 15:08:27.326435795 +0000 UTC m=+1259.611661073" observedRunningTime="2026-03-12 15:08:28.909829726 +0000 UTC m=+1261.195055004" watchObservedRunningTime="2026-03-12 15:08:28.910039982 +0000 UTC m=+1261.195265260" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.012794 4869 scope.go:117] "RemoveContainer" containerID="8a021443c4c1a11e6a0d7c8fb63dbb3f3362f0318db827eb7d54796300ac257b" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.031877 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55d7966795-cnsx8"] Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.044430 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55d7966795-cnsx8"] Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.135477 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.246438 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.499880 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54df54bd-p64lp"] Mar 12 15:08:29 crc kubenswrapper[4869]: E0312 15:08:29.500261 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-api" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.500277 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-api" Mar 12 15:08:29 crc kubenswrapper[4869]: E0312 15:08:29.500298 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-httpd" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.500305 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-httpd" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.500463 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-httpd" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.500497 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" containerName="neutron-api" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.501481 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.527942 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54df54bd-p64lp"] Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.565204 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-scripts\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.565248 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-combined-ca-bundle\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.565294 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-internal-tls-certs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.565374 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-public-tls-certs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.565440 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-config-data\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.565461 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl766\" (UniqueName: \"kubernetes.io/projected/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-kube-api-access-pl766\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.565479 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-logs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.669769 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-public-tls-certs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.669889 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-config-data\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.669930 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl766\" (UniqueName: \"kubernetes.io/projected/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-kube-api-access-pl766\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.669960 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-logs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.669996 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-scripts\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.670022 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-combined-ca-bundle\") pod 
\"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.670069 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-internal-tls-certs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.674943 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-logs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.679369 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-public-tls-certs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.681429 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-internal-tls-certs\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.682079 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-combined-ca-bundle\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 
15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.682845 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-scripts\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.683384 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-config-data\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.702372 4869 scope.go:117] "RemoveContainer" containerID="d9fe9d3940873f347f06b02ae3fc65897bb6879cd59752fc2822fffb67ef4540" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.721477 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl766\" (UniqueName: \"kubernetes.io/projected/efae3fc6-7dc2-4ad5-86af-ca7e623670c8-kube-api-access-pl766\") pod \"placement-54df54bd-p64lp\" (UID: \"efae3fc6-7dc2-4ad5-86af-ca7e623670c8\") " pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.839237 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.863260 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-566d64c64b-vx76m" event={"ID":"573219b1-c8f0-49ab-86ac-d2861f55dfae","Type":"ContainerStarted","Data":"354245a04deb6afcb292430c01148c30f5393c32639cbf6a2491923f3436ea51"} Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.863304 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-566d64c64b-vx76m" event={"ID":"573219b1-c8f0-49ab-86ac-d2861f55dfae","Type":"ContainerStarted","Data":"fe0faadac8789b6c41820ed8010e3588357bef81e094ff3af06b5a7ed23abf92"} Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.864313 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.864337 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.876204 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerStarted","Data":"bbeb4784032f18091ab3ebb3557765d719c3cdb93ad1cb66ba79c60042c1a016"} Mar 12 15:08:29 crc kubenswrapper[4869]: I0312 15:08:29.895741 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-566d64c64b-vx76m" podStartSLOduration=2.8957191570000003 podStartE2EDuration="2.895719157s" podCreationTimestamp="2026-03-12 15:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:29.89127725 +0000 UTC m=+1262.176502548" watchObservedRunningTime="2026-03-12 15:08:29.895719157 +0000 UTC m=+1262.180944435" Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.299550 4869 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54df54bd-p64lp"] Mar 12 15:08:30 crc kubenswrapper[4869]: W0312 15:08:30.307052 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefae3fc6_7dc2_4ad5_86af_ca7e623670c8.slice/crio-f8ee6d7a041f9ac1b688bb055980e25b55e708ea6ca15ae80a1b6638cbee370f WatchSource:0}: Error finding container f8ee6d7a041f9ac1b688bb055980e25b55e708ea6ca15ae80a1b6638cbee370f: Status 404 returned error can't find the container with id f8ee6d7a041f9ac1b688bb055980e25b55e708ea6ca15ae80a1b6638cbee370f Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.350347 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4e8fd1-48da-4dac-8d0b-bed00a35ad66" path="/var/lib/kubelet/pods/bc4e8fd1-48da-4dac-8d0b-bed00a35ad66/volumes" Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.926905 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54df54bd-p64lp" event={"ID":"efae3fc6-7dc2-4ad5-86af-ca7e623670c8","Type":"ContainerStarted","Data":"c9e5dece2915cf527015336bcbd436de1a6dfec220af30a0006fecd78470fbad"} Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.927163 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.927174 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54df54bd-p64lp" event={"ID":"efae3fc6-7dc2-4ad5-86af-ca7e623670c8","Type":"ContainerStarted","Data":"1edc4a51b7d288c95b53a49a1e82c8649ecf25229c1a17221f7388c4d77d8b1f"} Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.927185 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54df54bd-p64lp" Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.927193 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-54df54bd-p64lp" event={"ID":"efae3fc6-7dc2-4ad5-86af-ca7e623670c8","Type":"ContainerStarted","Data":"f8ee6d7a041f9ac1b688bb055980e25b55e708ea6ca15ae80a1b6638cbee370f"} Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.932654 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerStarted","Data":"4329c2f75308269fa25f20e3a946f5880a52e98019bae0743d891f3e785ef8ff"} Mar 12 15:08:30 crc kubenswrapper[4869]: I0312 15:08:30.964764 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54df54bd-p64lp" podStartSLOduration=1.964738424 podStartE2EDuration="1.964738424s" podCreationTimestamp="2026-03-12 15:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:30.947418719 +0000 UTC m=+1263.232643997" watchObservedRunningTime="2026-03-12 15:08:30.964738424 +0000 UTC m=+1263.249963702" Mar 12 15:08:31 crc kubenswrapper[4869]: I0312 15:08:31.941890 4869 generic.go:334] "Generic (PLEG): container finished" podID="e8721cab-3eb8-4c80-a0c8-79c7e007b614" containerID="08b04517a54d502fe370e31cd01123f53f5eb592994f193a2f2359fd30ade15f" exitCode=0 Mar 12 15:08:31 crc kubenswrapper[4869]: I0312 15:08:31.941997 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-9p9gw" event={"ID":"e8721cab-3eb8-4c80-a0c8-79c7e007b614","Type":"ContainerDied","Data":"08b04517a54d502fe370e31cd01123f53f5eb592994f193a2f2359fd30ade15f"} Mar 12 15:08:31 crc kubenswrapper[4869]: I0312 15:08:31.945465 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerStarted","Data":"68f8228c41c7a1e89b5fa127af5baed49ccb31777ec001ffb327de077e00fa70"} Mar 12 15:08:32 crc kubenswrapper[4869]: I0312 15:08:32.380672 4869 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-ccddd4fb7-22sz6" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.398957 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-9p9gw" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.574223 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-config-data\") pod \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.574310 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-combined-ca-bundle\") pod \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.574342 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r49ql\" (UniqueName: \"kubernetes.io/projected/e8721cab-3eb8-4c80-a0c8-79c7e007b614-kube-api-access-r49ql\") pod \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.574442 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-job-config-data\") pod \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\" (UID: \"e8721cab-3eb8-4c80-a0c8-79c7e007b614\") " Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.582917 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-job-config-data" (OuterVolumeSpecName: "job-config-data") pod 
"e8721cab-3eb8-4c80-a0c8-79c7e007b614" (UID: "e8721cab-3eb8-4c80-a0c8-79c7e007b614"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.583014 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8721cab-3eb8-4c80-a0c8-79c7e007b614-kube-api-access-r49ql" (OuterVolumeSpecName: "kube-api-access-r49ql") pod "e8721cab-3eb8-4c80-a0c8-79c7e007b614" (UID: "e8721cab-3eb8-4c80-a0c8-79c7e007b614"). InnerVolumeSpecName "kube-api-access-r49ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.591680 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-config-data" (OuterVolumeSpecName: "config-data") pod "e8721cab-3eb8-4c80-a0c8-79c7e007b614" (UID: "e8721cab-3eb8-4c80-a0c8-79c7e007b614"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.605638 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8721cab-3eb8-4c80-a0c8-79c7e007b614" (UID: "e8721cab-3eb8-4c80-a0c8-79c7e007b614"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.641300 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 12 15:08:33 crc kubenswrapper[4869]: E0312 15:08:33.641662 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8721cab-3eb8-4c80-a0c8-79c7e007b614" containerName="manila-db-sync" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.641678 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8721cab-3eb8-4c80-a0c8-79c7e007b614" containerName="manila-db-sync" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.641859 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8721cab-3eb8-4c80-a0c8-79c7e007b614" containerName="manila-db-sync" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.642454 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.644606 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6rjxs" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.648574 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.648907 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.661972 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.676768 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.676803 4869 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.676814 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r49ql\" (UniqueName: \"kubernetes.io/projected/e8721cab-3eb8-4c80-a0c8-79c7e007b614-kube-api-access-r49ql\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.676824 4869 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e8721cab-3eb8-4c80-a0c8-79c7e007b614-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.778633 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkb9\" (UniqueName: \"kubernetes.io/projected/ea3ffca3-9d12-4095-b382-52a0a55a7c69-kube-api-access-mmkb9\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.778702 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.778941 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config-secret\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.779020 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.880813 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config-secret\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.880870 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.880957 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkb9\" (UniqueName: \"kubernetes.io/projected/ea3ffca3-9d12-4095-b382-52a0a55a7c69-kube-api-access-mmkb9\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.880995 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.881885 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.887283 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config-secret\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.887324 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.902168 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkb9\" (UniqueName: \"kubernetes.io/projected/ea3ffca3-9d12-4095-b382-52a0a55a7c69-kube-api-access-mmkb9\") pod \"openstackclient\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.905901 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.906577 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.956268 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.966448 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 12 15:08:33 crc kubenswrapper[4869]: I0312 15:08:33.968219 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.008719 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.011872 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerStarted","Data":"e14e6076d0352ac1b10d52a6bfe7de727ec71a4c462e25668d80f34c59523246"} Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.012098 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.014153 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-9p9gw" event={"ID":"e8721cab-3eb8-4c80-a0c8-79c7e007b614","Type":"ContainerDied","Data":"cac60e0160fd577099f2d2db83bdb6f515cc375d92843fe6f69a71109923a491"} Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.014179 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cac60e0160fd577099f2d2db83bdb6f515cc375d92843fe6f69a71109923a491" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.014222 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-9p9gw" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.041975 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.7032500280000002 podStartE2EDuration="9.041958565s" podCreationTimestamp="2026-03-12 15:08:25 +0000 UTC" firstStartedPulling="2026-03-12 15:08:27.818974134 +0000 UTC m=+1260.104199412" lastFinishedPulling="2026-03-12 15:08:33.157682671 +0000 UTC m=+1265.442907949" observedRunningTime="2026-03-12 15:08:34.029746886 +0000 UTC m=+1266.314972164" watchObservedRunningTime="2026-03-12 15:08:34.041958565 +0000 UTC m=+1266.327183843" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.088729 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df9ac527-ae76-4cb7-b474-61f5699e610f-openstack-config\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.088774 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nhkl\" (UniqueName: \"kubernetes.io/projected/df9ac527-ae76-4cb7-b474-61f5699e610f-kube-api-access-6nhkl\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.088797 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df9ac527-ae76-4cb7-b474-61f5699e610f-openstack-config-secret\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.088823 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df9ac527-ae76-4cb7-b474-61f5699e610f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.190464 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/df9ac527-ae76-4cb7-b474-61f5699e610f-openstack-config\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.190514 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nhkl\" (UniqueName: \"kubernetes.io/projected/df9ac527-ae76-4cb7-b474-61f5699e610f-kube-api-access-6nhkl\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.190555 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df9ac527-ae76-4cb7-b474-61f5699e610f-openstack-config-secret\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.190576 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df9ac527-ae76-4cb7-b474-61f5699e610f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.222503 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/df9ac527-ae76-4cb7-b474-61f5699e610f-openstack-config\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.223845 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/df9ac527-ae76-4cb7-b474-61f5699e610f-openstack-config-secret\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.223928 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.237312 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df9ac527-ae76-4cb7-b474-61f5699e610f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.241152 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.244835 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.245246 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.246932 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.251624 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.286969 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nhkl\" (UniqueName: \"kubernetes.io/projected/df9ac527-ae76-4cb7-b474-61f5699e610f-kube-api-access-6nhkl\") pod \"openstackclient\" (UID: \"df9ac527-ae76-4cb7-b474-61f5699e610f\") " pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.246986 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-kftdm" Mar 12 15:08:34 crc kubenswrapper[4869]: E0312 15:08:34.313652 4869 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 12 15:08:34 crc kubenswrapper[4869]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ea3ffca3-9d12-4095-b382-52a0a55a7c69_0(f98fc816cfb55c7453c7a49de9ed1e125eb03f072f8f084df808829b9d7dc0c6): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f98fc816cfb55c7453c7a49de9ed1e125eb03f072f8f084df808829b9d7dc0c6" Netns:"/var/run/netns/cb13981d-257d-4368-a8e6-605a8ae2bbf6" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f98fc816cfb55c7453c7a49de9ed1e125eb03f072f8f084df808829b9d7dc0c6;K8S_POD_UID=ea3ffca3-9d12-4095-b382-52a0a55a7c69" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ea3ffca3-9d12-4095-b382-52a0a55a7c69]: expected pod UID "ea3ffca3-9d12-4095-b382-52a0a55a7c69" but got "df9ac527-ae76-4cb7-b474-61f5699e610f" from Kube API Mar 12 15:08:34 crc kubenswrapper[4869]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 12 15:08:34 crc kubenswrapper[4869]: > Mar 12 15:08:34 crc kubenswrapper[4869]: E0312 15:08:34.313729 4869 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 12 15:08:34 crc kubenswrapper[4869]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ea3ffca3-9d12-4095-b382-52a0a55a7c69_0(f98fc816cfb55c7453c7a49de9ed1e125eb03f072f8f084df808829b9d7dc0c6): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f98fc816cfb55c7453c7a49de9ed1e125eb03f072f8f084df808829b9d7dc0c6" Netns:"/var/run/netns/cb13981d-257d-4368-a8e6-605a8ae2bbf6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f98fc816cfb55c7453c7a49de9ed1e125eb03f072f8f084df808829b9d7dc0c6;K8S_POD_UID=ea3ffca3-9d12-4095-b382-52a0a55a7c69" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: 
[openstack/openstackclient/ea3ffca3-9d12-4095-b382-52a0a55a7c69]: expected pod UID "ea3ffca3-9d12-4095-b382-52a0a55a7c69" but got "df9ac527-ae76-4cb7-b474-61f5699e610f" from Kube API Mar 12 15:08:34 crc kubenswrapper[4869]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 12 15:08:34 crc kubenswrapper[4869]: > pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.317346 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.423594 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.429078 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.429108 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.430497 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.431580 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.431604 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-scripts\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.433979 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436113 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-scripts\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436247 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa83a8e-fd16-444a-8967-17725d75565d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436280 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436381 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436436 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436458 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-ceph\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436475 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436498 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799ln\" (UniqueName: 
\"kubernetes.io/projected/3fa83a8e-fd16-444a-8967-17725d75565d-kube-api-access-799ln\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436519 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqc8q\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-kube-api-access-kqc8q\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436584 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436603 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.436639 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.445612 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gp8rq"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 
15:08:34.466250 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95ccfc9f9-9rcc7"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.467851 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.476456 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95ccfc9f9-9rcc7"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.494975 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.497835 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.499922 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.514602 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538071 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538106 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-scripts\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538125 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-scripts\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538166 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa83a8e-fd16-444a-8967-17725d75565d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538182 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538224 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538252 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538269 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-ceph\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " 
pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538284 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538303 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799ln\" (UniqueName: \"kubernetes.io/projected/3fa83a8e-fd16-444a-8967-17725d75565d-kube-api-access-799ln\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538344 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqc8q\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-kube-api-access-kqc8q\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538374 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538389 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.538408 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.541789 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.544250 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa83a8e-fd16-444a-8967-17725d75565d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.545819 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.555592 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.556900 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.557481 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-scripts\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.558268 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.560231 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.565825 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.565800 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-scripts\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 
crc kubenswrapper[4869]: I0312 15:08:34.570201 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.573299 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-ceph\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.583759 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqc8q\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-kube-api-access-kqc8q\") pod \"manila-share-share1-0\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.602514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799ln\" (UniqueName: \"kubernetes.io/projected/3fa83a8e-fd16-444a-8967-17725d75565d-kube-api-access-799ln\") pod \"manila-scheduler-0\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") " pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646051 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-scripts\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646114 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-config\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646137 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646183 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-nb\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646202 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646270 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d159686-9ca4-48e1-8df5-445aa803eb8c-etc-machine-id\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646307 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44mbn\" (UniqueName: 
\"kubernetes.io/projected/3110530e-08c4-483d-898f-bcb2eeb0cc62-kube-api-access-44mbn\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646365 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-swift-storage-0\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646391 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-svc\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646445 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d159686-9ca4-48e1-8df5-445aa803eb8c-logs\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646487 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksfk\" (UniqueName: \"kubernetes.io/projected/1d159686-9ca4-48e1-8df5-445aa803eb8c-kube-api-access-4ksfk\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646508 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data-custom\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.646522 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-sb\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.747310 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752284 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-swift-storage-0\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752315 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-svc\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752356 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d159686-9ca4-48e1-8df5-445aa803eb8c-logs\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752398 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4ksfk\" (UniqueName: \"kubernetes.io/projected/1d159686-9ca4-48e1-8df5-445aa803eb8c-kube-api-access-4ksfk\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752509 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data-custom\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752532 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-sb\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752595 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-scripts\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752624 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-config\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752641 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-combined-ca-bundle\") pod \"manila-api-0\" (UID: 
\"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752674 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-nb\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752690 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752708 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d159686-9ca4-48e1-8df5-445aa803eb8c-etc-machine-id\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.752739 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44mbn\" (UniqueName: \"kubernetes.io/projected/3110530e-08c4-483d-898f-bcb2eeb0cc62-kube-api-access-44mbn\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.753219 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-swift-storage-0\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: 
I0312 15:08:34.753822 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-svc\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.754357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-config\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.754998 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d159686-9ca4-48e1-8df5-445aa803eb8c-logs\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.754999 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d159686-9ca4-48e1-8df5-445aa803eb8c-etc-machine-id\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.755674 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-nb\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.756187 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-sb\") pod 
\"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.765817 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.766391 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data-custom\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.770135 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.770213 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-scripts\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.782281 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksfk\" (UniqueName: \"kubernetes.io/projected/1d159686-9ca4-48e1-8df5-445aa803eb8c-kube-api-access-4ksfk\") pod \"manila-api-0\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " pod="openstack/manila-api-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.787757 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.787929 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44mbn\" (UniqueName: \"kubernetes.io/projected/3110530e-08c4-483d-898f-bcb2eeb0cc62-kube-api-access-44mbn\") pod \"dnsmasq-dns-95ccfc9f9-9rcc7\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.811452 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:34 crc kubenswrapper[4869]: I0312 15:08:34.846973 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.027919 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" podUID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" containerName="dnsmasq-dns" containerID="cri-o://8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7" gracePeriod=10 Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.029163 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.047170 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.084326 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.089063 4869 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ea3ffca3-9d12-4095-b382-52a0a55a7c69" podUID="df9ac527-ae76-4cb7-b474-61f5699e610f" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.263910 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config\") pod \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.264039 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-combined-ca-bundle\") pod \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.264110 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmkb9\" (UniqueName: \"kubernetes.io/projected/ea3ffca3-9d12-4095-b382-52a0a55a7c69-kube-api-access-mmkb9\") pod \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.264131 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config-secret\") pod \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\" (UID: \"ea3ffca3-9d12-4095-b382-52a0a55a7c69\") " Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.264508 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ea3ffca3-9d12-4095-b382-52a0a55a7c69" (UID: "ea3ffca3-9d12-4095-b382-52a0a55a7c69"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.276394 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3ffca3-9d12-4095-b382-52a0a55a7c69-kube-api-access-mmkb9" (OuterVolumeSpecName: "kube-api-access-mmkb9") pod "ea3ffca3-9d12-4095-b382-52a0a55a7c69" (UID: "ea3ffca3-9d12-4095-b382-52a0a55a7c69"). InnerVolumeSpecName "kube-api-access-mmkb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.285053 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea3ffca3-9d12-4095-b382-52a0a55a7c69" (UID: "ea3ffca3-9d12-4095-b382-52a0a55a7c69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.285136 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ea3ffca3-9d12-4095-b382-52a0a55a7c69" (UID: "ea3ffca3-9d12-4095-b382-52a0a55a7c69"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.368674 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.368702 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.368711 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmkb9\" (UniqueName: \"kubernetes.io/projected/ea3ffca3-9d12-4095-b382-52a0a55a7c69-kube-api-access-mmkb9\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.368719 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea3ffca3-9d12-4095-b382-52a0a55a7c69-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.596139 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.632710 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95ccfc9f9-9rcc7"] Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.702034 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.766071 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f798b7b68-jtm8h" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: 
connect: connection refused" Mar 12 15:08:35 crc kubenswrapper[4869]: I0312 15:08:35.908763 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:35 crc kubenswrapper[4869]: W0312 15:08:35.933964 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d159686_9ca4_48e1_8df5_445aa803eb8c.slice/crio-d4c5fefb2af0d84b1c13bdbe786df81c64f84d962fdc84ed630d87039d3cc24c WatchSource:0}: Error finding container d4c5fefb2af0d84b1c13bdbe786df81c64f84d962fdc84ed630d87039d3cc24c: Status 404 returned error can't find the container with id d4c5fefb2af0d84b1c13bdbe786df81c64f84d962fdc84ed630d87039d3cc24c Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.008964 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.064723 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3fa83a8e-fd16-444a-8967-17725d75565d","Type":"ContainerStarted","Data":"a4eacc1352efee4e3f0b177418ac7cc5acb4c13aadb0779ebb54e35c2a1aba2e"} Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.072659 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" event={"ID":"3110530e-08c4-483d-898f-bcb2eeb0cc62","Type":"ContainerStarted","Data":"4a6f5fcd37bd9960009b050a58faeb823ec610a3e411721956e863845d147445"} Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.113740 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-swift-storage-0\") pod \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.113837 4869 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bdztz\" (UniqueName: \"kubernetes.io/projected/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-kube-api-access-bdztz\") pod \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.113959 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-config\") pod \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.114036 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-nb\") pod \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.114140 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-svc\") pod \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.121989 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-sb\") pod \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\" (UID: \"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07\") " Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.190072 4869 generic.go:334] "Generic (PLEG): container finished" podID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" containerID="8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7" exitCode=0 Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.190189 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" event={"ID":"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07","Type":"ContainerDied","Data":"8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7"} Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.190223 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" event={"ID":"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07","Type":"ContainerDied","Data":"cebe402b114474f0290c30836f0cfa5d0b1971908410c9ed2cbe05d206cbfe8a"} Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.190276 4869 scope.go:117] "RemoveContainer" containerID="8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.190597 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gp8rq" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.211031 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-kube-api-access-bdztz" (OuterVolumeSpecName: "kube-api-access-bdztz") pod "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" (UID: "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07"). InnerVolumeSpecName "kube-api-access-bdztz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.244657 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdztz\" (UniqueName: \"kubernetes.io/projected/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-kube-api-access-bdztz\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.249794 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1d159686-9ca4-48e1-8df5-445aa803eb8c","Type":"ContainerStarted","Data":"d4c5fefb2af0d84b1c13bdbe786df81c64f84d962fdc84ed630d87039d3cc24c"} Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.260269 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"df9ac527-ae76-4cb7-b474-61f5699e610f","Type":"ContainerStarted","Data":"4e9f403b34ce1ec40c753f7ad561847766caddaa805bee02f2c88d604ce033d7"} Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.267740 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.268100 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6838a2d7-2052-45b9-a8d5-3aa6639bccb4","Type":"ContainerStarted","Data":"5eef3bff8ede5e368449574dbd8de81c3649bcd7bf68cc7651c22688dcd5131d"} Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.301450 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" (UID: "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.310125 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" (UID: "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.346348 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-config" (OuterVolumeSpecName: "config") pod "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" (UID: "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.348068 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.348099 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.348110 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.357011 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" (UID: 
"75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.370909 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3ffca3-9d12-4095-b382-52a0a55a7c69" path="/var/lib/kubelet/pods/ea3ffca3-9d12-4095-b382-52a0a55a7c69/volumes" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.390386 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" (UID: "75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.452025 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.452064 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.531070 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gp8rq"] Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.538182 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gp8rq"] Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.619919 4869 scope.go:117] "RemoveContainer" containerID="8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.650341 4869 scope.go:117] "RemoveContainer" 
containerID="8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7" Mar 12 15:08:36 crc kubenswrapper[4869]: E0312 15:08:36.651966 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7\": container with ID starting with 8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7 not found: ID does not exist" containerID="8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.652005 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7"} err="failed to get container status \"8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7\": rpc error: code = NotFound desc = could not find container \"8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7\": container with ID starting with 8b3ee80975e1d3643409b5fb530997091c8816b2bd0ecfcbcaacde4a9d977fe7 not found: ID does not exist" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.652092 4869 scope.go:117] "RemoveContainer" containerID="8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b" Mar 12 15:08:36 crc kubenswrapper[4869]: E0312 15:08:36.660827 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b\": container with ID starting with 8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b not found: ID does not exist" containerID="8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b" Mar 12 15:08:36 crc kubenswrapper[4869]: I0312 15:08:36.660884 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b"} err="failed to get container status \"8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b\": rpc error: code = NotFound desc = could not find container \"8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b\": container with ID starting with 8bad80e7ca3862201ed9397ef85339a0a24474d2913af32a9dc8a4b734152c8b not found: ID does not exist" Mar 12 15:08:37 crc kubenswrapper[4869]: I0312 15:08:37.085922 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:37 crc kubenswrapper[4869]: I0312 15:08:37.259427 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:37 crc kubenswrapper[4869]: I0312 15:08:37.330827 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1d159686-9ca4-48e1-8df5-445aa803eb8c","Type":"ContainerStarted","Data":"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7"} Mar 12 15:08:37 crc kubenswrapper[4869]: I0312 15:08:37.332669 4869 generic.go:334] "Generic (PLEG): container finished" podID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerID="192ea470b280bb582eae77c3f3779e33d7e7b61f275b0a207a15770c19a306ec" exitCode=0 Mar 12 15:08:37 crc kubenswrapper[4869]: I0312 15:08:37.332715 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" event={"ID":"3110530e-08c4-483d-898f-bcb2eeb0cc62","Type":"ContainerDied","Data":"192ea470b280bb582eae77c3f3779e33d7e7b61f275b0a207a15770c19a306ec"} Mar 12 15:08:37 crc kubenswrapper[4869]: I0312 15:08:37.733200 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.442905 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" path="/var/lib/kubelet/pods/75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07/volumes" Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.457405 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3fa83a8e-fd16-444a-8967-17725d75565d","Type":"ContainerStarted","Data":"3c98d32f355dd68e7daf78c6478c6758363c3cb4ff1388fc0fbf56edd988d467"} Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.459215 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" event={"ID":"3110530e-08c4-483d-898f-bcb2eeb0cc62","Type":"ContainerStarted","Data":"b14c0e4c43c6113161339211452c8a28a565cbc34dd4a650599d4d8ab5e77c51"} Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.459961 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.461870 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1d159686-9ca4-48e1-8df5-445aa803eb8c","Type":"ContainerStarted","Data":"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe"} Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.461968 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api-log" containerID="cri-o://a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7" gracePeriod=30 Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.462223 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.462252 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api" 
containerID="cri-o://a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe" gracePeriod=30 Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.669820 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" podStartSLOduration=4.669799358 podStartE2EDuration="4.669799358s" podCreationTimestamp="2026-03-12 15:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:38.641081077 +0000 UTC m=+1270.926306355" watchObservedRunningTime="2026-03-12 15:08:38.669799358 +0000 UTC m=+1270.955024636" Mar 12 15:08:38 crc kubenswrapper[4869]: I0312 15:08:38.674598 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.674587305 podStartE2EDuration="4.674587305s" podCreationTimestamp="2026-03-12 15:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:38.66110928 +0000 UTC m=+1270.946334558" watchObservedRunningTime="2026-03-12 15:08:38.674587305 +0000 UTC m=+1270.959812583" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.405995 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.462728 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data\") pod \"1d159686-9ca4-48e1-8df5-445aa803eb8c\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.462809 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ksfk\" (UniqueName: \"kubernetes.io/projected/1d159686-9ca4-48e1-8df5-445aa803eb8c-kube-api-access-4ksfk\") pod \"1d159686-9ca4-48e1-8df5-445aa803eb8c\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.462837 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d159686-9ca4-48e1-8df5-445aa803eb8c-logs\") pod \"1d159686-9ca4-48e1-8df5-445aa803eb8c\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.462948 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-scripts\") pod \"1d159686-9ca4-48e1-8df5-445aa803eb8c\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.463007 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d159686-9ca4-48e1-8df5-445aa803eb8c-etc-machine-id\") pod \"1d159686-9ca4-48e1-8df5-445aa803eb8c\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.463071 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data-custom\") pod \"1d159686-9ca4-48e1-8df5-445aa803eb8c\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.463142 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-combined-ca-bundle\") pod \"1d159686-9ca4-48e1-8df5-445aa803eb8c\" (UID: \"1d159686-9ca4-48e1-8df5-445aa803eb8c\") " Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.475689 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d159686-9ca4-48e1-8df5-445aa803eb8c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1d159686-9ca4-48e1-8df5-445aa803eb8c" (UID: "1d159686-9ca4-48e1-8df5-445aa803eb8c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.477277 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d159686-9ca4-48e1-8df5-445aa803eb8c-logs" (OuterVolumeSpecName: "logs") pod "1d159686-9ca4-48e1-8df5-445aa803eb8c" (UID: "1d159686-9ca4-48e1-8df5-445aa803eb8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.516408 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-scripts" (OuterVolumeSpecName: "scripts") pod "1d159686-9ca4-48e1-8df5-445aa803eb8c" (UID: "1d159686-9ca4-48e1-8df5-445aa803eb8c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.518781 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d159686-9ca4-48e1-8df5-445aa803eb8c-kube-api-access-4ksfk" (OuterVolumeSpecName: "kube-api-access-4ksfk") pod "1d159686-9ca4-48e1-8df5-445aa803eb8c" (UID: "1d159686-9ca4-48e1-8df5-445aa803eb8c"). InnerVolumeSpecName "kube-api-access-4ksfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.536764 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d159686-9ca4-48e1-8df5-445aa803eb8c" (UID: "1d159686-9ca4-48e1-8df5-445aa803eb8c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.537066 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3fa83a8e-fd16-444a-8967-17725d75565d","Type":"ContainerStarted","Data":"ac01121649e7151b6fccc1ceaef28cc60d4fc6a663ba1bdcfc03a76d5b3025cf"} Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.574025 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d159686-9ca4-48e1-8df5-445aa803eb8c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.574063 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.574075 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ksfk\" (UniqueName: 
\"kubernetes.io/projected/1d159686-9ca4-48e1-8df5-445aa803eb8c-kube-api-access-4ksfk\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.574084 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d159686-9ca4-48e1-8df5-445aa803eb8c-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.574095 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.579936 4869 generic.go:334] "Generic (PLEG): container finished" podID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerID="a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe" exitCode=143 Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.579968 4869 generic.go:334] "Generic (PLEG): container finished" podID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerID="a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7" exitCode=143 Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.580948 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.585103 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1d159686-9ca4-48e1-8df5-445aa803eb8c","Type":"ContainerDied","Data":"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe"} Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.585167 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1d159686-9ca4-48e1-8df5-445aa803eb8c","Type":"ContainerDied","Data":"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7"} Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.585180 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"1d159686-9ca4-48e1-8df5-445aa803eb8c","Type":"ContainerDied","Data":"d4c5fefb2af0d84b1c13bdbe786df81c64f84d962fdc84ed630d87039d3cc24c"} Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.585201 4869 scope.go:117] "RemoveContainer" containerID="a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.600801 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d159686-9ca4-48e1-8df5-445aa803eb8c" (UID: "1d159686-9ca4-48e1-8df5-445aa803eb8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.620642 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data" (OuterVolumeSpecName: "config-data") pod "1d159686-9ca4-48e1-8df5-445aa803eb8c" (UID: "1d159686-9ca4-48e1-8df5-445aa803eb8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.643978 4869 scope.go:117] "RemoveContainer" containerID="a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.677439 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.677469 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d159686-9ca4-48e1-8df5-445aa803eb8c-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.708643 4869 scope.go:117] "RemoveContainer" containerID="a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe" Mar 12 15:08:39 crc kubenswrapper[4869]: E0312 15:08:39.710050 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe\": container with ID starting with a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe not found: ID does not exist" containerID="a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.710164 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe"} err="failed to get container status \"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe\": rpc error: code = NotFound desc = could not find container \"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe\": container with ID starting with a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe not found: ID does not 
exist" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.710190 4869 scope.go:117] "RemoveContainer" containerID="a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7" Mar 12 15:08:39 crc kubenswrapper[4869]: E0312 15:08:39.710557 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7\": container with ID starting with a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7 not found: ID does not exist" containerID="a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.710600 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7"} err="failed to get container status \"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7\": rpc error: code = NotFound desc = could not find container \"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7\": container with ID starting with a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7 not found: ID does not exist" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.710627 4869 scope.go:117] "RemoveContainer" containerID="a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.710964 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe"} err="failed to get container status \"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe\": rpc error: code = NotFound desc = could not find container \"a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe\": container with ID starting with a0413f892069f4629a0039c2467728fc594a26a01cbb0da3ffe6b34da2f4dbbe not found: ID 
does not exist" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.710996 4869 scope.go:117] "RemoveContainer" containerID="a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.711177 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7"} err="failed to get container status \"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7\": rpc error: code = NotFound desc = could not find container \"a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7\": container with ID starting with a77471c37a712719f7a64fe9ed0a5da9bac1d44e001ad011363d90cea2a4e6f7 not found: ID does not exist" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.903947 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.525918553 podStartE2EDuration="5.903930199s" podCreationTimestamp="2026-03-12 15:08:34 +0000 UTC" firstStartedPulling="2026-03-12 15:08:35.731773 +0000 UTC m=+1268.016998278" lastFinishedPulling="2026-03-12 15:08:37.109784646 +0000 UTC m=+1269.395009924" observedRunningTime="2026-03-12 15:08:39.573835497 +0000 UTC m=+1271.859060775" watchObservedRunningTime="2026-03-12 15:08:39.903930199 +0000 UTC m=+1272.189155477" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.919329 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.927177 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.950442 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:39 crc kubenswrapper[4869]: E0312 15:08:39.950910 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.950929 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api" Mar 12 15:08:39 crc kubenswrapper[4869]: E0312 15:08:39.950944 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" containerName="init" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.950951 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" containerName="init" Mar 12 15:08:39 crc kubenswrapper[4869]: E0312 15:08:39.950976 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" containerName="dnsmasq-dns" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.950983 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" containerName="dnsmasq-dns" Mar 12 15:08:39 crc kubenswrapper[4869]: E0312 15:08:39.951008 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api-log" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.951013 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api-log" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.951173 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.951199 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" containerName="manila-api-log" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.951213 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f308bb-16c9-4cf1-a9fb-1b5e4c1bbc07" 
containerName="dnsmasq-dns" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.952196 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.960353 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.960903 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.961199 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987591 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987686 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-config-data\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987738 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88gr\" (UniqueName: \"kubernetes.io/projected/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-kube-api-access-t88gr\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987771 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 
15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987806 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-scripts\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987831 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-config-data-custom\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987874 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-internal-tls-certs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987901 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-public-tls-certs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987935 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-etc-machine-id\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:39 crc kubenswrapper[4869]: I0312 15:08:39.987963 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-logs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.089511 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-config-data\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.089599 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t88gr\" (UniqueName: \"kubernetes.io/projected/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-kube-api-access-t88gr\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.089641 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.089679 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-scripts\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.089736 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-config-data-custom\") pod \"manila-api-0\" (UID: 
\"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.090310 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-internal-tls-certs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.090344 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-public-tls-certs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.090378 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-etc-machine-id\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.090407 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-logs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.090913 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-logs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.093010 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-etc-machine-id\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.096806 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.097403 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-scripts\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.097767 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-config-data-custom\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.098415 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-config-data\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.107216 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-internal-tls-certs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.108300 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-public-tls-certs\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.110313 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88gr\" (UniqueName: \"kubernetes.io/projected/b0c3a0a4-5746-41fe-a7a5-0386aedb2a86-kube-api-access-t88gr\") pod \"manila-api-0\" (UID: \"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86\") " pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.279680 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.372873 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d159686-9ca4-48e1-8df5-445aa803eb8c" path="/var/lib/kubelet/pods/1d159686-9ca4-48e1-8df5-445aa803eb8c/volumes" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.410777 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.626792 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7fgp" event={"ID":"ca1953bb-fc5d-4285-8de4-b67746201d05","Type":"ContainerStarted","Data":"665d0805fd38b2e1ff328cadab61dfff64c4ea1f95dc5fe946805046dfa08a04"} Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.660454 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-t7fgp" podStartSLOduration=3.68368314 podStartE2EDuration="1m14.660436838s" podCreationTimestamp="2026-03-12 15:07:26 +0000 UTC" firstStartedPulling="2026-03-12 15:07:28.217565393 +0000 UTC m=+1200.502790671" lastFinishedPulling="2026-03-12 15:08:39.194319091 +0000 UTC 
m=+1271.479544369" observedRunningTime="2026-03-12 15:08:40.656937778 +0000 UTC m=+1272.942163056" watchObservedRunningTime="2026-03-12 15:08:40.660436838 +0000 UTC m=+1272.945662116" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.673869 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-566d64c64b-vx76m" Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.747342 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75658d85fd-ktwbd"] Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.747579 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75658d85fd-ktwbd" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api-log" containerID="cri-o://4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349" gracePeriod=30 Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.748006 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75658d85fd-ktwbd" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api" containerID="cri-o://6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983" gracePeriod=30 Mar 12 15:08:40 crc kubenswrapper[4869]: I0312 15:08:40.954039 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 12 15:08:41 crc kubenswrapper[4869]: I0312 15:08:41.663586 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86","Type":"ContainerStarted","Data":"b2928099490130f83c80d2dc51da85313ed2535cd9dd4e130849b74e1fe2cf0f"} Mar 12 15:08:41 crc kubenswrapper[4869]: I0312 15:08:41.663912 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86","Type":"ContainerStarted","Data":"b09414e77ff381775f5223b9ef444d86b8e315f71240e928f4b7dd3d805094a9"} Mar 12 15:08:41 crc kubenswrapper[4869]: I0312 15:08:41.683219 4869 generic.go:334] "Generic (PLEG): container finished" podID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerID="4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349" exitCode=143 Mar 12 15:08:41 crc kubenswrapper[4869]: I0312 15:08:41.683263 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75658d85fd-ktwbd" event={"ID":"898fa39d-459b-4d0b-969a-b8e93ea03fe1","Type":"ContainerDied","Data":"4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349"} Mar 12 15:08:42 crc kubenswrapper[4869]: I0312 15:08:42.701014 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b0c3a0a4-5746-41fe-a7a5-0386aedb2a86","Type":"ContainerStarted","Data":"63238771d37645192466b77869b27c500e409bdfd796f074b0bab70fd2e494be"} Mar 12 15:08:42 crc kubenswrapper[4869]: I0312 15:08:42.703455 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 12 15:08:42 crc kubenswrapper[4869]: I0312 15:08:42.733272 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.733248858 podStartE2EDuration="3.733248858s" podCreationTimestamp="2026-03-12 15:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:42.717851128 +0000 UTC m=+1275.003076406" watchObservedRunningTime="2026-03-12 15:08:42.733248858 +0000 UTC m=+1275.018474136" Mar 12 15:08:43 crc kubenswrapper[4869]: I0312 15:08:43.947836 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:43 crc kubenswrapper[4869]: I0312 15:08:43.948125 4869 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-central-agent" containerID="cri-o://bbeb4784032f18091ab3ebb3557765d719c3cdb93ad1cb66ba79c60042c1a016" gracePeriod=30 Mar 12 15:08:43 crc kubenswrapper[4869]: I0312 15:08:43.949117 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="sg-core" containerID="cri-o://68f8228c41c7a1e89b5fa127af5baed49ccb31777ec001ffb327de077e00fa70" gracePeriod=30 Mar 12 15:08:43 crc kubenswrapper[4869]: I0312 15:08:43.949233 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="proxy-httpd" containerID="cri-o://e14e6076d0352ac1b10d52a6bfe7de727ec71a4c462e25668d80f34c59523246" gracePeriod=30 Mar 12 15:08:43 crc kubenswrapper[4869]: I0312 15:08:43.949549 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-notification-agent" containerID="cri-o://4329c2f75308269fa25f20e3a946f5880a52e98019bae0743d891f3e785ef8ff" gracePeriod=30 Mar 12 15:08:43 crc kubenswrapper[4869]: I0312 15:08:43.968740 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.170:3000/\": EOF" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.322419 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5bcd88f7f5-258jj"] Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.324267 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.343898 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.344173 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.344278 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.379442 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bcd88f7f5-258jj"] Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.402466 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vplsd\" (UniqueName: \"kubernetes.io/projected/944bd0f4-e7ea-430d-995e-fabdf1f79bab-kube-api-access-vplsd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.402574 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-config-data\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.402592 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944bd0f4-e7ea-430d-995e-fabdf1f79bab-run-httpd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: 
I0312 15:08:44.402623 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-combined-ca-bundle\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.402654 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/944bd0f4-e7ea-430d-995e-fabdf1f79bab-etc-swift\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.402687 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944bd0f4-e7ea-430d-995e-fabdf1f79bab-log-httpd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.402765 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-internal-tls-certs\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.402783 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-public-tls-certs\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 
15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.509228 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944bd0f4-e7ea-430d-995e-fabdf1f79bab-log-httpd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.509785 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-internal-tls-certs\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.509815 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-public-tls-certs\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.509931 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vplsd\" (UniqueName: \"kubernetes.io/projected/944bd0f4-e7ea-430d-995e-fabdf1f79bab-kube-api-access-vplsd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.510060 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-config-data\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 
15:08:44.510096 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944bd0f4-e7ea-430d-995e-fabdf1f79bab-run-httpd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.510139 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-combined-ca-bundle\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.510179 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/944bd0f4-e7ea-430d-995e-fabdf1f79bab-etc-swift\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.511357 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944bd0f4-e7ea-430d-995e-fabdf1f79bab-log-httpd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.512767 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/944bd0f4-e7ea-430d-995e-fabdf1f79bab-run-httpd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.519114 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/944bd0f4-e7ea-430d-995e-fabdf1f79bab-etc-swift\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.520702 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-public-tls-certs\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.520955 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-combined-ca-bundle\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.523278 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-internal-tls-certs\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.531732 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944bd0f4-e7ea-430d-995e-fabdf1f79bab-config-data\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.537248 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vplsd\" (UniqueName: 
\"kubernetes.io/projected/944bd0f4-e7ea-430d-995e-fabdf1f79bab-kube-api-access-vplsd\") pod \"swift-proxy-5bcd88f7f5-258jj\" (UID: \"944bd0f4-e7ea-430d-995e-fabdf1f79bab\") " pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.598661 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.676959 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.720245 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/898fa39d-459b-4d0b-969a-b8e93ea03fe1-logs\") pod \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.720318 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data\") pod \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.720410 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data-custom\") pod \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.720439 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-combined-ca-bundle\") pod \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " Mar 12 
15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.721023 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbfrj\" (UniqueName: \"kubernetes.io/projected/898fa39d-459b-4d0b-969a-b8e93ea03fe1-kube-api-access-vbfrj\") pod \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\" (UID: \"898fa39d-459b-4d0b-969a-b8e93ea03fe1\") " Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.721515 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/898fa39d-459b-4d0b-969a-b8e93ea03fe1-logs" (OuterVolumeSpecName: "logs") pod "898fa39d-459b-4d0b-969a-b8e93ea03fe1" (UID: "898fa39d-459b-4d0b-969a-b8e93ea03fe1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.722157 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/898fa39d-459b-4d0b-969a-b8e93ea03fe1-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.725528 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/898fa39d-459b-4d0b-969a-b8e93ea03fe1-kube-api-access-vbfrj" (OuterVolumeSpecName: "kube-api-access-vbfrj") pod "898fa39d-459b-4d0b-969a-b8e93ea03fe1" (UID: "898fa39d-459b-4d0b-969a-b8e93ea03fe1"). InnerVolumeSpecName "kube-api-access-vbfrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.728644 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "898fa39d-459b-4d0b-969a-b8e93ea03fe1" (UID: "898fa39d-459b-4d0b-969a-b8e93ea03fe1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.729206 4869 generic.go:334] "Generic (PLEG): container finished" podID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerID="e14e6076d0352ac1b10d52a6bfe7de727ec71a4c462e25668d80f34c59523246" exitCode=0 Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.729239 4869 generic.go:334] "Generic (PLEG): container finished" podID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerID="68f8228c41c7a1e89b5fa127af5baed49ccb31777ec001ffb327de077e00fa70" exitCode=2 Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.729246 4869 generic.go:334] "Generic (PLEG): container finished" podID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerID="4329c2f75308269fa25f20e3a946f5880a52e98019bae0743d891f3e785ef8ff" exitCode=0 Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.729252 4869 generic.go:334] "Generic (PLEG): container finished" podID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerID="bbeb4784032f18091ab3ebb3557765d719c3cdb93ad1cb66ba79c60042c1a016" exitCode=0 Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.729292 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerDied","Data":"e14e6076d0352ac1b10d52a6bfe7de727ec71a4c462e25668d80f34c59523246"} Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.729320 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerDied","Data":"68f8228c41c7a1e89b5fa127af5baed49ccb31777ec001ffb327de077e00fa70"} Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.729330 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerDied","Data":"4329c2f75308269fa25f20e3a946f5880a52e98019bae0743d891f3e785ef8ff"} Mar 12 15:08:44 crc 
kubenswrapper[4869]: I0312 15:08:44.729339 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerDied","Data":"bbeb4784032f18091ab3ebb3557765d719c3cdb93ad1cb66ba79c60042c1a016"} Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.731080 4869 generic.go:334] "Generic (PLEG): container finished" podID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerID="6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983" exitCode=0 Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.732032 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75658d85fd-ktwbd" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.732483 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75658d85fd-ktwbd" event={"ID":"898fa39d-459b-4d0b-969a-b8e93ea03fe1","Type":"ContainerDied","Data":"6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983"} Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.732512 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75658d85fd-ktwbd" event={"ID":"898fa39d-459b-4d0b-969a-b8e93ea03fe1","Type":"ContainerDied","Data":"c559838923928761160e138484aa661c04e2927ea8c18f5a76dfa8f5e0350a76"} Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.732528 4869 scope.go:117] "RemoveContainer" containerID="6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.758570 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "898fa39d-459b-4d0b-969a-b8e93ea03fe1" (UID: "898fa39d-459b-4d0b-969a-b8e93ea03fe1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.785137 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data" (OuterVolumeSpecName: "config-data") pod "898fa39d-459b-4d0b-969a-b8e93ea03fe1" (UID: "898fa39d-459b-4d0b-969a-b8e93ea03fe1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.789694 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.813693 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.824724 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.824968 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.825090 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898fa39d-459b-4d0b-969a-b8e93ea03fe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.825189 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbfrj\" (UniqueName: \"kubernetes.io/projected/898fa39d-459b-4d0b-969a-b8e93ea03fe1-kube-api-access-vbfrj\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.916003 4869 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-78k6r"] Mar 12 15:08:44 crc kubenswrapper[4869]: I0312 15:08:44.916293 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="dnsmasq-dns" containerID="cri-o://779f0090be3e86acdb1e053767078ea8011c69e92098502aee66b94993914d15" gracePeriod=10 Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.122665 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75658d85fd-ktwbd"] Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.126624 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75658d85fd-ktwbd"] Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.745941 4869 generic.go:334] "Generic (PLEG): container finished" podID="95e0a90a-0269-471b-bd8d-a110809af063" containerID="779f0090be3e86acdb1e053767078ea8011c69e92098502aee66b94993914d15" exitCode=0 Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.746009 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" event={"ID":"95e0a90a-0269-471b-bd8d-a110809af063","Type":"ContainerDied","Data":"779f0090be3e86acdb1e053767078ea8011c69e92098502aee66b94993914d15"} Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.748372 4869 generic.go:334] "Generic (PLEG): container finished" podID="ca1953bb-fc5d-4285-8de4-b67746201d05" containerID="665d0805fd38b2e1ff328cadab61dfff64c4ea1f95dc5fe946805046dfa08a04" exitCode=0 Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.748402 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7fgp" event={"ID":"ca1953bb-fc5d-4285-8de4-b67746201d05","Type":"ContainerDied","Data":"665d0805fd38b2e1ff328cadab61dfff64c4ea1f95dc5fe946805046dfa08a04"} Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.768701 4869 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/horizon-7f798b7b68-jtm8h" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 12 15:08:45 crc kubenswrapper[4869]: I0312 15:08:45.768902 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:08:46 crc kubenswrapper[4869]: I0312 15:08:46.349932 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" path="/var/lib/kubelet/pods/898fa39d-459b-4d0b-969a-b8e93ea03fe1/volumes" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.639871 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mb78b"] Mar 12 15:08:47 crc kubenswrapper[4869]: E0312 15:08:47.640334 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api-log" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.640350 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api-log" Mar 12 15:08:47 crc kubenswrapper[4869]: E0312 15:08:47.640376 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.640384 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.640641 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.640679 4869 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api-log" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.641334 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.654312 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mb78b"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.710011 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-operator-scripts\") pod \"nova-api-db-create-mb78b\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") " pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.710106 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfhsr\" (UniqueName: \"kubernetes.io/projected/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-kube-api-access-wfhsr\") pod \"nova-api-db-create-mb78b\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") " pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.754606 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-96j6d"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.756135 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.784249 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-96j6d"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.809769 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1e62-account-create-update-8jrd8"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.814823 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.817459 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.818574 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1e62-account-create-update-8jrd8"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.819317 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfhsr\" (UniqueName: \"kubernetes.io/projected/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-kube-api-access-wfhsr\") pod \"nova-api-db-create-mb78b\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") " pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.819504 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e396abd5-7a42-405e-827f-85f8426c6ed6-operator-scripts\") pod \"nova-cell0-db-create-96j6d\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") " pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.819634 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvklw\" (UniqueName: 
\"kubernetes.io/projected/e396abd5-7a42-405e-827f-85f8426c6ed6-kube-api-access-zvklw\") pod \"nova-cell0-db-create-96j6d\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") " pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.819689 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-operator-scripts\") pod \"nova-api-db-create-mb78b\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") " pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.824390 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-operator-scripts\") pod \"nova-api-db-create-mb78b\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") " pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.847709 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfhsr\" (UniqueName: \"kubernetes.io/projected/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-kube-api-access-wfhsr\") pod \"nova-api-db-create-mb78b\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") " pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.866781 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-pzfq5"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.868023 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.875989 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pzfq5"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.923235 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvklw\" (UniqueName: \"kubernetes.io/projected/e396abd5-7a42-405e-827f-85f8426c6ed6-kube-api-access-zvklw\") pod \"nova-cell0-db-create-96j6d\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") " pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.923446 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e396abd5-7a42-405e-827f-85f8426c6ed6-operator-scripts\") pod \"nova-cell0-db-create-96j6d\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") " pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.923508 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtdgf\" (UniqueName: \"kubernetes.io/projected/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-kube-api-access-wtdgf\") pod \"nova-api-1e62-account-create-update-8jrd8\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") " pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.923555 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-operator-scripts\") pod \"nova-api-1e62-account-create-update-8jrd8\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") " pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.924703 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e396abd5-7a42-405e-827f-85f8426c6ed6-operator-scripts\") pod \"nova-cell0-db-create-96j6d\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") " pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.960252 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-43b5-account-create-update-5ggll"] Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.961462 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.965743 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.967139 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvklw\" (UniqueName: \"kubernetes.io/projected/e396abd5-7a42-405e-827f-85f8426c6ed6-kube-api-access-zvklw\") pod \"nova-cell0-db-create-96j6d\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") " pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:47 crc kubenswrapper[4869]: I0312 15:08:47.987520 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-43b5-account-create-update-5ggll"] Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.006286 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mb78b" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.027565 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42cdbca9-8e71-4fd8-8389-d4476417217b-operator-scripts\") pod \"nova-cell1-db-create-pzfq5\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") " pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.027816 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqws\" (UniqueName: \"kubernetes.io/projected/42cdbca9-8e71-4fd8-8389-d4476417217b-kube-api-access-5cqws\") pod \"nova-cell1-db-create-pzfq5\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") " pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.027913 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtdgf\" (UniqueName: \"kubernetes.io/projected/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-kube-api-access-wtdgf\") pod \"nova-api-1e62-account-create-update-8jrd8\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") " pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.027955 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjbnn\" (UniqueName: \"kubernetes.io/projected/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-kube-api-access-zjbnn\") pod \"nova-cell0-43b5-account-create-update-5ggll\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.027990 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-operator-scripts\") pod \"nova-api-1e62-account-create-update-8jrd8\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") " pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.028125 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-operator-scripts\") pod \"nova-cell0-43b5-account-create-update-5ggll\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.029384 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-operator-scripts\") pod \"nova-api-1e62-account-create-update-8jrd8\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") " pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.047237 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtdgf\" (UniqueName: \"kubernetes.io/projected/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-kube-api-access-wtdgf\") pod \"nova-api-1e62-account-create-update-8jrd8\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") " pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.092218 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.129874 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42cdbca9-8e71-4fd8-8389-d4476417217b-operator-scripts\") pod \"nova-cell1-db-create-pzfq5\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") " pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.129983 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqws\" (UniqueName: \"kubernetes.io/projected/42cdbca9-8e71-4fd8-8389-d4476417217b-kube-api-access-5cqws\") pod \"nova-cell1-db-create-pzfq5\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") " pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.130019 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjbnn\" (UniqueName: \"kubernetes.io/projected/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-kube-api-access-zjbnn\") pod \"nova-cell0-43b5-account-create-update-5ggll\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.130078 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-operator-scripts\") pod \"nova-cell0-43b5-account-create-update-5ggll\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.130933 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-operator-scripts\") pod 
\"nova-cell0-43b5-account-create-update-5ggll\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.131327 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42cdbca9-8e71-4fd8-8389-d4476417217b-operator-scripts\") pod \"nova-cell1-db-create-pzfq5\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") " pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.146411 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqws\" (UniqueName: \"kubernetes.io/projected/42cdbca9-8e71-4fd8-8389-d4476417217b-kube-api-access-5cqws\") pod \"nova-cell1-db-create-pzfq5\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") " pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.149000 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjbnn\" (UniqueName: \"kubernetes.io/projected/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-kube-api-access-zjbnn\") pod \"nova-cell0-43b5-account-create-update-5ggll\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.162980 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e653-account-create-update-mjf5p"] Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.164232 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.169668 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.187805 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e653-account-create-update-mjf5p"] Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.205597 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1e62-account-create-update-8jrd8" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.221369 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.232033 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8081e56b-635e-4137-982b-c5eafd77af8e-operator-scripts\") pod \"nova-cell1-e653-account-create-update-mjf5p\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.232228 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpzf4\" (UniqueName: \"kubernetes.io/projected/8081e56b-635e-4137-982b-c5eafd77af8e-kube-api-access-xpzf4\") pod \"nova-cell1-e653-account-create-update-mjf5p\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.306439 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.333910 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpzf4\" (UniqueName: \"kubernetes.io/projected/8081e56b-635e-4137-982b-c5eafd77af8e-kube-api-access-xpzf4\") pod \"nova-cell1-e653-account-create-update-mjf5p\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.333996 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8081e56b-635e-4137-982b-c5eafd77af8e-operator-scripts\") pod \"nova-cell1-e653-account-create-update-mjf5p\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.335430 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8081e56b-635e-4137-982b-c5eafd77af8e-operator-scripts\") pod \"nova-cell1-e653-account-create-update-mjf5p\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.369873 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpzf4\" (UniqueName: \"kubernetes.io/projected/8081e56b-635e-4137-982b-c5eafd77af8e-kube-api-access-xpzf4\") pod \"nova-cell1-e653-account-create-update-mjf5p\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.556166 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:08:48 crc kubenswrapper[4869]: I0312 15:08:48.606938 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: connect: connection refused" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.239806 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.352248 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca1953bb-fc5d-4285-8de4-b67746201d05-etc-machine-id\") pod \"ca1953bb-fc5d-4285-8de4-b67746201d05\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.352907 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s57vj\" (UniqueName: \"kubernetes.io/projected/ca1953bb-fc5d-4285-8de4-b67746201d05-kube-api-access-s57vj\") pod \"ca1953bb-fc5d-4285-8de4-b67746201d05\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.352943 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-combined-ca-bundle\") pod \"ca1953bb-fc5d-4285-8de4-b67746201d05\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.352976 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-scripts\") pod \"ca1953bb-fc5d-4285-8de4-b67746201d05\" (UID: 
\"ca1953bb-fc5d-4285-8de4-b67746201d05\") " Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.353039 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-config-data\") pod \"ca1953bb-fc5d-4285-8de4-b67746201d05\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.353065 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-db-sync-config-data\") pod \"ca1953bb-fc5d-4285-8de4-b67746201d05\" (UID: \"ca1953bb-fc5d-4285-8de4-b67746201d05\") " Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.352754 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca1953bb-fc5d-4285-8de4-b67746201d05-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ca1953bb-fc5d-4285-8de4-b67746201d05" (UID: "ca1953bb-fc5d-4285-8de4-b67746201d05"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.364910 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1953bb-fc5d-4285-8de4-b67746201d05-kube-api-access-s57vj" (OuterVolumeSpecName: "kube-api-access-s57vj") pod "ca1953bb-fc5d-4285-8de4-b67746201d05" (UID: "ca1953bb-fc5d-4285-8de4-b67746201d05"). InnerVolumeSpecName "kube-api-access-s57vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.365033 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-scripts" (OuterVolumeSpecName: "scripts") pod "ca1953bb-fc5d-4285-8de4-b67746201d05" (UID: "ca1953bb-fc5d-4285-8de4-b67746201d05"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.376013 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ca1953bb-fc5d-4285-8de4-b67746201d05" (UID: "ca1953bb-fc5d-4285-8de4-b67746201d05"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.392734 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca1953bb-fc5d-4285-8de4-b67746201d05" (UID: "ca1953bb-fc5d-4285-8de4-b67746201d05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.427183 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75658d85fd-ktwbd" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": dial tcp 10.217.0.168:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.427384 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75658d85fd-ktwbd" podUID="898fa39d-459b-4d0b-969a-b8e93ea03fe1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": dial tcp 10.217.0.168:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.452513 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-config-data" (OuterVolumeSpecName: "config-data") pod "ca1953bb-fc5d-4285-8de4-b67746201d05" (UID: "ca1953bb-fc5d-4285-8de4-b67746201d05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.456011 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.456046 4869 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.456059 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.456071 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca1953bb-fc5d-4285-8de4-b67746201d05-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.456083 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s57vj\" (UniqueName: \"kubernetes.io/projected/ca1953bb-fc5d-4285-8de4-b67746201d05-kube-api-access-s57vj\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.456095 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1953bb-fc5d-4285-8de4-b67746201d05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.686145 4869 patch_prober.go:28] interesting 
pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.686209 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.686259 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.687125 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f5452570a2d00afc7e7591fc67b3884055af23168ca0e1d9b4ff0e5dcdc6950"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.687193 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://2f5452570a2d00afc7e7591fc67b3884055af23168ca0e1d9b4ff0e5dcdc6950" gracePeriod=600 Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.820357 4869 generic.go:334] "Generic (PLEG): container finished" podID="b4d63031-e072-466e-ae3c-d829a699b197" containerID="a26eec3d4fd18f4819fca706f27ddd2aa65eae9751ad98f439f13147a5de8480" exitCode=137 Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 
15:08:49.820424 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f798b7b68-jtm8h" event={"ID":"b4d63031-e072-466e-ae3c-d829a699b197","Type":"ContainerDied","Data":"a26eec3d4fd18f4819fca706f27ddd2aa65eae9751ad98f439f13147a5de8480"} Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.822308 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7fgp" event={"ID":"ca1953bb-fc5d-4285-8de4-b67746201d05","Type":"ContainerDied","Data":"b21d3ca01c0107f42f8287a0c0d733e88fcf2e0da3f1602a7f3ac0c90f260b58"} Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.822334 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b21d3ca01c0107f42f8287a0c0d733e88fcf2e0da3f1602a7f3ac0c90f260b58" Mar 12 15:08:49 crc kubenswrapper[4869]: I0312 15:08:49.822385 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7fgp" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.478722 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: E0312 15:08:50.479417 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05" containerName="cinder-db-sync" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.479430 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05" containerName="cinder-db-sync" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.480104 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05" containerName="cinder-db-sync" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.481114 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.493882 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.493956 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9c8hn" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.493890 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.494120 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.537121 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.586678 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.586742 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.586838 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " 
pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.586869 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whx8k\" (UniqueName: \"kubernetes.io/projected/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-kube-api-access-whx8k\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.586963 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.587012 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.623484 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56696ff475-snqth"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.625119 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.655688 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.657415 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.661890 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.666602 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-snqth"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.688887 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh92j\" (UniqueName: \"kubernetes.io/projected/7d8820b4-254f-4f89-8609-a8b86b0d5796-kube-api-access-qh92j\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.688935 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-svc\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.688978 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689028 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: 
I0312 15:08:50.689084 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-sb\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689111 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-config\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689138 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-swift-storage-0\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689175 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689203 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689275 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689301 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-nb\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.689329 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whx8k\" (UniqueName: \"kubernetes.io/projected/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-kube-api-access-whx8k\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.695447 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.699359 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.703872 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.704502 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.708857 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.715241 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.715325 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.717221 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.722088 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.725191 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.727967 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whx8k\" (UniqueName: \"kubernetes.io/projected/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-kube-api-access-whx8k\") pod \"cinder-scheduler-0\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798032 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798195 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798292 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798365 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28v8v\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-kube-api-access-28v8v\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798473 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-lib-modules\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798608 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798736 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-run\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798809 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh92j\" (UniqueName: \"kubernetes.io/projected/7d8820b4-254f-4f89-8609-a8b86b0d5796-kube-api-access-qh92j\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.798927 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-svc\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799049 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-scripts\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799092 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhbtk\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-kube-api-access-mhbtk\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799125 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799200 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799291 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-sb\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799327 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799352 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-dev\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799373 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-run\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799398 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-config\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799430 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-swift-storage-0\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: 
\"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799476 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799526 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799598 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799647 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799681 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: 
\"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799701 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799737 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799783 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799805 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-ceph\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799836 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc 
kubenswrapper[4869]: I0312 15:08:50.799885 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-sys\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799914 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-nb\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799936 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.799966 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.800001 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.800030 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.800050 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.800069 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.800091 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.801475 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-svc\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.802136 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-sb\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.802777 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-config\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.804607 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-swift-storage-0\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.807582 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-nb\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.820969 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.829865 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh92j\" (UniqueName: \"kubernetes.io/projected/7d8820b4-254f-4f89-8609-a8b86b0d5796-kube-api-access-qh92j\") pod \"dnsmasq-dns-56696ff475-snqth\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.865919 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="2f5452570a2d00afc7e7591fc67b3884055af23168ca0e1d9b4ff0e5dcdc6950" exitCode=0 Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.866211 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"2f5452570a2d00afc7e7591fc67b3884055af23168ca0e1d9b4ff0e5dcdc6950"} Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.876647 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.879158 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.888954 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.901624 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.901854 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.901923 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902010 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902095 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: 
\"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902177 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-ceph\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902255 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902333 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-sys\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902409 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902485 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902572 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902652 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902720 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902806 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902873 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.902958 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " 
pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903024 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903089 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903172 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28v8v\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-kube-api-access-28v8v\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903259 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-lib-modules\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903415 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-run\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903517 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903653 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-scripts\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903729 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhbtk\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-kube-api-access-mhbtk\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903815 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.903912 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.904013 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-dev\") pod 
\"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.904120 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-dev\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.904484 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-run\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.904650 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.904740 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.904812 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 
15:08:50.905130 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.905294 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.905418 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.905510 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.905619 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.905699 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-lib-modules\") 
pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.908723 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.908775 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-sys\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.908805 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.912802 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.912874 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-lib-modules\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.912903 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-run\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.913275 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.913313 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-dev\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.918005 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-run\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.918200 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.919213 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc 
kubenswrapper[4869]: I0312 15:08:50.919643 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.919930 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.920057 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.920899 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.924152 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-ceph\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.925306 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-scripts\") pod 
\"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.931408 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.931613 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.932774 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.933549 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.941558 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.942257 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.948528 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28v8v\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-kube-api-access-28v8v\") pod \"cinder-volume-volume1-0\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.950955 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-scripts\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.953001 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.954129 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhbtk\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-kube-api-access-mhbtk\") pod \"cinder-backup-0\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " pod="openstack/cinder-backup-0" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.974986 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:08:50 crc kubenswrapper[4869]: I0312 15:08:50.986975 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.007716 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b204f01-5a99-40b4-9cbf-38711f1d1f82-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.007759 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.007840 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b204f01-5a99-40b4-9cbf-38711f1d1f82-logs\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.007858 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-scripts\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.007885 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data-custom\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.007901 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.007956 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrbn\" (UniqueName: \"kubernetes.io/projected/9b204f01-5a99-40b4-9cbf-38711f1d1f82-kube-api-access-csrbn\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.102823 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109150 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b204f01-5a99-40b4-9cbf-38711f1d1f82-logs\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109199 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-scripts\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109233 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data-custom\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109251 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109309 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrbn\" (UniqueName: \"kubernetes.io/projected/9b204f01-5a99-40b4-9cbf-38711f1d1f82-kube-api-access-csrbn\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109356 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b204f01-5a99-40b4-9cbf-38711f1d1f82-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109376 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109774 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b204f01-5a99-40b4-9cbf-38711f1d1f82-logs\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.109920 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b204f01-5a99-40b4-9cbf-38711f1d1f82-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " 
pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.112679 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-scripts\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.119160 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.120451 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data-custom\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.126765 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.148107 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrbn\" (UniqueName: \"kubernetes.io/projected/9b204f01-5a99-40b4-9cbf-38711f1d1f82-kube-api-access-csrbn\") pod \"cinder-api-0\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " pod="openstack/cinder-api-0" Mar 12 15:08:51 crc kubenswrapper[4869]: I0312 15:08:51.312064 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 15:08:52 crc kubenswrapper[4869]: I0312 15:08:52.734806 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:08:54 crc kubenswrapper[4869]: I0312 15:08:54.464113 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79647c85bf-8lt5x" Mar 12 15:08:54 crc kubenswrapper[4869]: I0312 15:08:54.541840 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f7dc9cccb-h2224"] Mar 12 15:08:54 crc kubenswrapper[4869]: I0312 15:08:54.542061 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f7dc9cccb-h2224" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-api" containerID="cri-o://1278217bb1774c11ce94df9c8b2d85393904c84d9a532b6e7bd9869fc86adf7b" gracePeriod=30 Mar 12 15:08:54 crc kubenswrapper[4869]: I0312 15:08:54.542462 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f7dc9cccb-h2224" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-httpd" containerID="cri-o://3b8f0131e0017262115ec0dc23c7b2937c9c0540e2564adb238f25d2f1afd88c" gracePeriod=30 Mar 12 15:08:54 crc kubenswrapper[4869]: I0312 15:08:54.915742 4869 generic.go:334] "Generic (PLEG): container finished" podID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerID="3b8f0131e0017262115ec0dc23c7b2937c9c0540e2564adb238f25d2f1afd88c" exitCode=0 Mar 12 15:08:54 crc kubenswrapper[4869]: I0312 15:08:54.915799 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f7dc9cccb-h2224" event={"ID":"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55","Type":"ContainerDied","Data":"3b8f0131e0017262115ec0dc23c7b2937c9c0540e2564adb238f25d2f1afd88c"} Mar 12 15:08:55 crc kubenswrapper[4869]: I0312 15:08:55.768587 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7f798b7b68-jtm8h" 
podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 12 15:08:56 crc kubenswrapper[4869]: I0312 15:08:56.357932 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.170:3000/\": dial tcp 10.217.0.170:3000: connect: connection refused" Mar 12 15:08:56 crc kubenswrapper[4869]: I0312 15:08:56.420992 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 12 15:08:56 crc kubenswrapper[4869]: I0312 15:08:56.465004 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 12 15:08:56 crc kubenswrapper[4869]: I0312 15:08:56.932866 4869 generic.go:334] "Generic (PLEG): container finished" podID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerID="1278217bb1774c11ce94df9c8b2d85393904c84d9a532b6e7bd9869fc86adf7b" exitCode=0 Mar 12 15:08:56 crc kubenswrapper[4869]: I0312 15:08:56.932968 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f7dc9cccb-h2224" event={"ID":"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55","Type":"ContainerDied","Data":"1278217bb1774c11ce94df9c8b2d85393904c84d9a532b6e7bd9869fc86adf7b"} Mar 12 15:08:56 crc kubenswrapper[4869]: I0312 15:08:56.933073 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="manila-scheduler" containerID="cri-o://3c98d32f355dd68e7daf78c6478c6758363c3cb4ff1388fc0fbf56edd988d467" gracePeriod=30 Mar 12 15:08:56 crc kubenswrapper[4869]: I0312 15:08:56.933160 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" 
podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="probe" containerID="cri-o://ac01121649e7151b6fccc1ceaef28cc60d4fc6a663ba1bdcfc03a76d5b3025cf" gracePeriod=30 Mar 12 15:08:57 crc kubenswrapper[4869]: E0312 15:08:57.305660 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Mar 12 15:08:57 crc kubenswrapper[4869]: E0312 15:08:57.306063 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n689h65h656h9bh8bh68ch5c6hc5h68bhb7hc6h6fh58h84h5cbhc6h657hb9h69h5f7h6dh599h5f5h544hdch98h74hbh5f5h568h68ch558q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
VolumeMount{Name:kube-api-access-6nhkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(df9ac527-ae76-4cb7-b474-61f5699e610f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:08:57 crc kubenswrapper[4869]: E0312 15:08:57.307587 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="df9ac527-ae76-4cb7-b474-61f5699e610f" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.436088 4869 scope.go:117] "RemoveContainer" containerID="4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.716890 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:08:57 crc kubenswrapper[4869]: E0312 15:08:57.744382 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fa83a8e_fd16_444a_8967_17725d75565d.slice/crio-conmon-ac01121649e7151b6fccc1ceaef28cc60d4fc6a663ba1bdcfc03a76d5b3025cf.scope\": RecentStats: unable to find data in memory cache]" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.853405 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-nb\") pod \"95e0a90a-0269-471b-bd8d-a110809af063\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.853459 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-svc\") pod \"95e0a90a-0269-471b-bd8d-a110809af063\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.853487 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-swift-storage-0\") pod \"95e0a90a-0269-471b-bd8d-a110809af063\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.853515 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pzfj\" (UniqueName: \"kubernetes.io/projected/95e0a90a-0269-471b-bd8d-a110809af063-kube-api-access-7pzfj\") pod \"95e0a90a-0269-471b-bd8d-a110809af063\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.853532 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-sb\") pod \"95e0a90a-0269-471b-bd8d-a110809af063\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.853716 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-config\") pod \"95e0a90a-0269-471b-bd8d-a110809af063\" (UID: \"95e0a90a-0269-471b-bd8d-a110809af063\") " Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.859310 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e0a90a-0269-471b-bd8d-a110809af063-kube-api-access-7pzfj" (OuterVolumeSpecName: "kube-api-access-7pzfj") pod "95e0a90a-0269-471b-bd8d-a110809af063" (UID: "95e0a90a-0269-471b-bd8d-a110809af063"). InnerVolumeSpecName "kube-api-access-7pzfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.927033 4869 scope.go:117] "RemoveContainer" containerID="6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983" Mar 12 15:08:57 crc kubenswrapper[4869]: E0312 15:08:57.929715 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983\": container with ID starting with 6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983 not found: ID does not exist" containerID="6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.929884 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983"} err="failed to get container status \"6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983\": rpc error: code = NotFound desc = could not find container \"6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983\": container with ID starting with 6691c87a5e7cc321e0c2d8e50068a6febe67be7c1ea060f760a86bb39e326983 not found: ID does not exist" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.929991 4869 scope.go:117] "RemoveContainer" containerID="4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349" Mar 12 15:08:57 crc kubenswrapper[4869]: E0312 15:08:57.932894 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349\": container with ID starting with 4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349 not found: ID does not exist" containerID="4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.933025 
4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349"} err="failed to get container status \"4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349\": rpc error: code = NotFound desc = could not find container \"4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349\": container with ID starting with 4a4631081d5e5fc7edf8ec441b4e6021fccc8a0eb43129ead7c218c73fc3c349 not found: ID does not exist" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.933809 4869 scope.go:117] "RemoveContainer" containerID="e0eefeacb8cbd3ff8e6abbc8c8ef619674d30f19c6c54cde868b340244e6c200" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.956735 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pzfj\" (UniqueName: \"kubernetes.io/projected/95e0a90a-0269-471b-bd8d-a110809af063-kube-api-access-7pzfj\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.958215 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" event={"ID":"95e0a90a-0269-471b-bd8d-a110809af063","Type":"ContainerDied","Data":"26c39212bc9b61dd7d52fb1ab0bd238568eff9f3f56ce2547dd801cf4f2da319"} Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.958248 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.963243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c67d578b-b8ef-43a0-a170-2f4f1ca48195","Type":"ContainerDied","Data":"31666c6e447b5f2d0573a0d1facad1178976f9462fb95a093acb0ea4d43af7cd"} Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.963277 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31666c6e447b5f2d0573a0d1facad1178976f9462fb95a093acb0ea4d43af7cd" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.965445 4869 generic.go:334] "Generic (PLEG): container finished" podID="3fa83a8e-fd16-444a-8967-17725d75565d" containerID="ac01121649e7151b6fccc1ceaef28cc60d4fc6a663ba1bdcfc03a76d5b3025cf" exitCode=0 Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.965486 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3fa83a8e-fd16-444a-8967-17725d75565d","Type":"ContainerDied","Data":"ac01121649e7151b6fccc1ceaef28cc60d4fc6a663ba1bdcfc03a76d5b3025cf"} Mar 12 15:08:57 crc kubenswrapper[4869]: E0312 15:08:57.972770 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="df9ac527-ae76-4cb7-b474-61f5699e610f" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.986859 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:57 crc kubenswrapper[4869]: I0312 15:08:57.989515 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95e0a90a-0269-471b-bd8d-a110809af063" (UID: "95e0a90a-0269-471b-bd8d-a110809af063"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.007118 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95e0a90a-0269-471b-bd8d-a110809af063" (UID: "95e0a90a-0269-471b-bd8d-a110809af063"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.009836 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "95e0a90a-0269-471b-bd8d-a110809af063" (UID: "95e0a90a-0269-471b-bd8d-a110809af063"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.011738 4869 scope.go:117] "RemoveContainer" containerID="779f0090be3e86acdb1e053767078ea8011c69e92098502aee66b94993914d15" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.012461 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95e0a90a-0269-471b-bd8d-a110809af063" (UID: "95e0a90a-0269-471b-bd8d-a110809af063"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.033308 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-config" (OuterVolumeSpecName: "config") pod "95e0a90a-0269-471b-bd8d-a110809af063" (UID: "95e0a90a-0269-471b-bd8d-a110809af063"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.062491 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-run-httpd\") pod \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.062581 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-sg-core-conf-yaml\") pod \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.062632 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-log-httpd\") pod \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.062652 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-config-data\") pod \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.062718 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-combined-ca-bundle\") pod \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.062792 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-scripts\") pod \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.062859 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gbg5\" (UniqueName: \"kubernetes.io/projected/c67d578b-b8ef-43a0-a170-2f4f1ca48195-kube-api-access-2gbg5\") pod \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\" (UID: \"c67d578b-b8ef-43a0-a170-2f4f1ca48195\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.063313 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.063331 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.063341 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.063351 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.063359 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95e0a90a-0269-471b-bd8d-a110809af063-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.065696 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c67d578b-b8ef-43a0-a170-2f4f1ca48195" (UID: "c67d578b-b8ef-43a0-a170-2f4f1ca48195"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.065750 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c67d578b-b8ef-43a0-a170-2f4f1ca48195" (UID: "c67d578b-b8ef-43a0-a170-2f4f1ca48195"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.088887 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-scripts" (OuterVolumeSpecName: "scripts") pod "c67d578b-b8ef-43a0-a170-2f4f1ca48195" (UID: "c67d578b-b8ef-43a0-a170-2f4f1ca48195"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.089135 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c67d578b-b8ef-43a0-a170-2f4f1ca48195-kube-api-access-2gbg5" (OuterVolumeSpecName: "kube-api-access-2gbg5") pod "c67d578b-b8ef-43a0-a170-2f4f1ca48195" (UID: "c67d578b-b8ef-43a0-a170-2f4f1ca48195"). InnerVolumeSpecName "kube-api-access-2gbg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.102620 4869 scope.go:117] "RemoveContainer" containerID="6e007d00dcfa5ddc50317efa17d7f0343c22e8dafe1fe1f1b4c1137ced411ee9" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.115714 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c67d578b-b8ef-43a0-a170-2f4f1ca48195" (UID: "c67d578b-b8ef-43a0-a170-2f4f1ca48195"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.166703 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.166732 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.166741 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.166751 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gbg5\" (UniqueName: \"kubernetes.io/projected/c67d578b-b8ef-43a0-a170-2f4f1ca48195-kube-api-access-2gbg5\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.166780 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c67d578b-b8ef-43a0-a170-2f4f1ca48195-run-httpd\") on node \"crc\" DevicePath \"\"" 
Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.171480 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.267630 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-combined-ca-bundle\") pod \"b4d63031-e072-466e-ae3c-d829a699b197\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.268135 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt9dr\" (UniqueName: \"kubernetes.io/projected/b4d63031-e072-466e-ae3c-d829a699b197-kube-api-access-pt9dr\") pod \"b4d63031-e072-466e-ae3c-d829a699b197\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.268695 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-config-data\") pod \"b4d63031-e072-466e-ae3c-d829a699b197\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.268841 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-secret-key\") pod \"b4d63031-e072-466e-ae3c-d829a699b197\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.268924 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d63031-e072-466e-ae3c-d829a699b197-logs\") pod \"b4d63031-e072-466e-ae3c-d829a699b197\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " Mar 12 15:08:58 crc kubenswrapper[4869]: 
I0312 15:08:58.268955 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-tls-certs\") pod \"b4d63031-e072-466e-ae3c-d829a699b197\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.269001 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-scripts\") pod \"b4d63031-e072-466e-ae3c-d829a699b197\" (UID: \"b4d63031-e072-466e-ae3c-d829a699b197\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.269468 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d63031-e072-466e-ae3c-d829a699b197-logs" (OuterVolumeSpecName: "logs") pod "b4d63031-e072-466e-ae3c-d829a699b197" (UID: "b4d63031-e072-466e-ae3c-d829a699b197"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.270313 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4d63031-e072-466e-ae3c-d829a699b197-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.272387 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c67d578b-b8ef-43a0-a170-2f4f1ca48195" (UID: "c67d578b-b8ef-43a0-a170-2f4f1ca48195"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.276292 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b4d63031-e072-466e-ae3c-d829a699b197" (UID: "b4d63031-e072-466e-ae3c-d829a699b197"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.276362 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d63031-e072-466e-ae3c-d829a699b197-kube-api-access-pt9dr" (OuterVolumeSpecName: "kube-api-access-pt9dr") pod "b4d63031-e072-466e-ae3c-d829a699b197" (UID: "b4d63031-e072-466e-ae3c-d829a699b197"). InnerVolumeSpecName "kube-api-access-pt9dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.276801 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-config-data" (OuterVolumeSpecName: "config-data") pod "c67d578b-b8ef-43a0-a170-2f4f1ca48195" (UID: "c67d578b-b8ef-43a0-a170-2f4f1ca48195"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.305416 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-config-data" (OuterVolumeSpecName: "config-data") pod "b4d63031-e072-466e-ae3c-d829a699b197" (UID: "b4d63031-e072-466e-ae3c-d829a699b197"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.320305 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-scripts" (OuterVolumeSpecName: "scripts") pod "b4d63031-e072-466e-ae3c-d829a699b197" (UID: "b4d63031-e072-466e-ae3c-d829a699b197"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.325015 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4d63031-e072-466e-ae3c-d829a699b197" (UID: "b4d63031-e072-466e-ae3c-d829a699b197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.363639 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b4d63031-e072-466e-ae3c-d829a699b197" (UID: "b4d63031-e072-466e-ae3c-d829a699b197"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373167 4869 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373223 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373254 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373266 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373278 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt9dr\" (UniqueName: \"kubernetes.io/projected/b4d63031-e072-466e-ae3c-d829a699b197-kube-api-access-pt9dr\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373293 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4d63031-e072-466e-ae3c-d829a699b197-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373303 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67d578b-b8ef-43a0-a170-2f4f1ca48195-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.373314 4869 
reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4d63031-e072-466e-ae3c-d829a699b197-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.591785 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-78k6r"] Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.603233 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-78k6r"] Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.607611 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-78k6r" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: i/o timeout" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.612636 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1e62-account-create-update-8jrd8"] Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.867920 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.974889 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1e62-account-create-update-8jrd8" event={"ID":"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7","Type":"ContainerStarted","Data":"de32733dca9c691a1b604fb64e3a77cd2eaec7842e4b9d2439487dbab3745bff"} Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.974937 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1e62-account-create-update-8jrd8" event={"ID":"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7","Type":"ContainerStarted","Data":"a536d0c13b45533668d79120ea209a7131da6ecbc32a9831848df29380c648a2"} Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.977018 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"e494c12edd1eec941b7037231b079cb054af121efa06d3e86355c27776905fd6"} Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.986631 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f7dc9cccb-h2224" event={"ID":"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55","Type":"ContainerDied","Data":"d40b7506475eaea5422b6633c3e9528b8dc954bd0913f646753609e71f8341aa"} Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.986674 4869 scope.go:117] "RemoveContainer" containerID="3b8f0131e0017262115ec0dc23c7b2937c9c0540e2564adb238f25d2f1afd88c" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.986763 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f7dc9cccb-h2224" Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.993322 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6838a2d7-2052-45b9-a8d5-3aa6639bccb4","Type":"ContainerStarted","Data":"8024b6d2f80b1844ef74c5d435773a5be37299846584c993574bc63e515f0c67"} Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.993473 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-ovndb-tls-certs\") pod \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.993519 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-config\") pod \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.993561 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tkpf\" (UniqueName: \"kubernetes.io/projected/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-kube-api-access-9tkpf\") pod \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.993728 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-combined-ca-bundle\") pod \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " Mar 12 15:08:58 crc kubenswrapper[4869]: I0312 15:08:58.993761 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-httpd-config\") pod \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\" (UID: \"d33ab78c-e70a-4b83-9c6d-d94fab2fbd55\") " Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.005801 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1e62-account-create-update-8jrd8" podStartSLOduration=12.005781815 podStartE2EDuration="12.005781815s" podCreationTimestamp="2026-03-12 15:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:08:58.990738705 +0000 UTC m=+1291.275963983" watchObservedRunningTime="2026-03-12 15:08:59.005781815 +0000 UTC m=+1291.291007093" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.009341 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" (UID: "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.010026 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-kube-api-access-9tkpf" (OuterVolumeSpecName: "kube-api-access-9tkpf") pod "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" (UID: "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55"). InnerVolumeSpecName "kube-api-access-9tkpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.021284 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f798b7b68-jtm8h" event={"ID":"b4d63031-e072-466e-ae3c-d829a699b197","Type":"ContainerDied","Data":"9c0db23a5ffbd9280c69817bf2cfa28632e216ba14ae4aad3ba2920b3f9b6ed9"} Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.021461 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f798b7b68-jtm8h" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.027404 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.066243 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f798b7b68-jtm8h"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.088515 4869 scope.go:117] "RemoveContainer" containerID="1278217bb1774c11ce94df9c8b2d85393904c84d9a532b6e7bd9869fc86adf7b" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.090592 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f798b7b68-jtm8h"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.122405 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.122449 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tkpf\" (UniqueName: \"kubernetes.io/projected/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-kube-api-access-9tkpf\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.149391 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" (UID: "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.160829 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-config" (OuterVolumeSpecName: "config") pod "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" (UID: "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.167255 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.175277 4869 scope.go:117] "RemoveContainer" containerID="8d06094a459cda2058ff32c4ae52ed8cd8b1f55cc120d4cd9b716f47aa66a7f1" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.179609 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.198358 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" (UID: "d33ab78c-e70a-4b83-9c6d-d94fab2fbd55"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.225872 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226725 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-notification-agent" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226743 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-notification-agent" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226770 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-httpd" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226779 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-httpd" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226791 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="init" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226799 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="init" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226809 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="sg-core" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226815 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="sg-core" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226831 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon-log" Mar 12 15:08:59 crc 
kubenswrapper[4869]: I0312 15:08:59.226842 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon-log" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226861 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226869 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226887 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-api" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226894 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-api" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226913 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="proxy-httpd" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226921 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="proxy-httpd" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226933 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-central-agent" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.226941 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-central-agent" Mar 12 15:08:59 crc kubenswrapper[4869]: E0312 15:08:59.226957 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="dnsmasq-dns" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 
15:08:59.226964 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="dnsmasq-dns" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227172 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e0a90a-0269-471b-bd8d-a110809af063" containerName="dnsmasq-dns" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227189 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-api" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227202 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-central-agent" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227219 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="proxy-httpd" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227228 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" containerName="neutron-httpd" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227243 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227251 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="ceilometer-notification-agent" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227265 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" containerName="sg-core" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.227281 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d63031-e072-466e-ae3c-d829a699b197" containerName="horizon-log" Mar 12 15:08:59 crc kubenswrapper[4869]: 
I0312 15:08:59.230844 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.236460 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.236716 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.242120 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.242163 4869 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.242176 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.285858 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.349622 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gnr\" (UniqueName: \"kubernetes.io/projected/cd15c8a9-0582-4c14-9450-407ad0cfa828-kube-api-access-d9gnr\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.349703 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.349744 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.349784 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-config-data\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.349906 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-log-httpd\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.350118 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-scripts\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.350162 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-run-httpd\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " 
pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.418603 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f7dc9cccb-h2224"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.439609 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f7dc9cccb-h2224"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.451567 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-scripts\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.451605 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-run-httpd\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.451697 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gnr\" (UniqueName: \"kubernetes.io/projected/cd15c8a9-0582-4c14-9450-407ad0cfa828-kube-api-access-d9gnr\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.451733 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.451756 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.451780 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-config-data\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.451802 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-log-httpd\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.452207 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-log-httpd\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.455164 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-run-httpd\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.458869 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.459273 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.459959 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-scripts\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.462664 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-config-data\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.467718 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gnr\" (UniqueName: \"kubernetes.io/projected/cd15c8a9-0582-4c14-9450-407ad0cfa828-kube-api-access-d9gnr\") pod \"ceilometer-0\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.497824 4869 scope.go:117] "RemoveContainer" containerID="a26eec3d4fd18f4819fca706f27ddd2aa65eae9751ad98f439f13147a5de8480" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.545377 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e653-account-create-update-mjf5p"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.575910 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.611592 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mb78b"] Mar 12 15:08:59 crc kubenswrapper[4869]: W0312 15:08:59.634207 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec9fe62a_a7c1_4c9b_8520_2046bc10995a.slice/crio-dbc4a01bbbfec1dc8adfacd8db467501d1a737302640868cbae03124eab2491a WatchSource:0}: Error finding container dbc4a01bbbfec1dc8adfacd8db467501d1a737302640868cbae03124eab2491a: Status 404 returned error can't find the container with id dbc4a01bbbfec1dc8adfacd8db467501d1a737302640868cbae03124eab2491a Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.644742 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-96j6d"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.656247 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.674623 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-43b5-account-create-update-5ggll"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.691821 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-pzfq5"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.701764 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bcd88f7f5-258jj"] Mar 12 15:08:59 crc kubenswrapper[4869]: I0312 15:08:59.947413 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-snqth"] Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.087865 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.148805 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-api-db-create-mb78b" event={"ID":"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2","Type":"ContainerStarted","Data":"4a124a617d00ca23ad987e9136a9cb1a047ad42157b1d3bf6c72b409e8b12ba8"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.176687 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.201679 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pzfq5" event={"ID":"42cdbca9-8e71-4fd8-8389-d4476417217b","Type":"ContainerStarted","Data":"cf8c1ee1f8a2a6e97433d5bb58238564829e8a8d64fd74fd3ee51b61754b85e6"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.258774 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.259127 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bcd88f7f5-258jj" event={"ID":"944bd0f4-e7ea-430d-995e-fabdf1f79bab","Type":"ContainerStarted","Data":"48e8d62ae11db83ba5a2a8e4552b2b7dfca078a719a33532754a40e55138c5f5"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.337802 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" event={"ID":"ec9fe62a-a7c1-4c9b-8520-2046bc10995a","Type":"ContainerStarted","Data":"dbc4a01bbbfec1dc8adfacd8db467501d1a737302640868cbae03124eab2491a"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.364892 4869 generic.go:334] "Generic (PLEG): container finished" podID="2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7" containerID="de32733dca9c691a1b604fb64e3a77cd2eaec7842e4b9d2439487dbab3745bff" exitCode=0 Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.464958 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e0a90a-0269-471b-bd8d-a110809af063" path="/var/lib/kubelet/pods/95e0a90a-0269-471b-bd8d-a110809af063/volumes" Mar 12 15:09:00 crc 
kubenswrapper[4869]: I0312 15:09:00.466048 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d63031-e072-466e-ae3c-d829a699b197" path="/var/lib/kubelet/pods/b4d63031-e072-466e-ae3c-d829a699b197/volumes" Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.466710 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c67d578b-b8ef-43a0-a170-2f4f1ca48195" path="/var/lib/kubelet/pods/c67d578b-b8ef-43a0-a170-2f4f1ca48195/volumes" Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.480876 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33ab78c-e70a-4b83-9c6d-d94fab2fbd55" path="/var/lib/kubelet/pods/d33ab78c-e70a-4b83-9c6d-d94fab2fbd55/volumes" Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.481518 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1e62-account-create-update-8jrd8" event={"ID":"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7","Type":"ContainerDied","Data":"de32733dca9c691a1b604fb64e3a77cd2eaec7842e4b9d2439487dbab3745bff"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.481566 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6838a2d7-2052-45b9-a8d5-3aa6639bccb4","Type":"ContainerStarted","Data":"7f81bd9fade21a35c31dd509835b1c12df5880f2683e340a53bc73ea3406d1ca"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.481580 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-96j6d" event={"ID":"e396abd5-7a42-405e-827f-85f8426c6ed6","Type":"ContainerStarted","Data":"43c63e505faed973731922370114da3d39980aa27a6b9f07370e7acaf583a194"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.502425 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" 
event={"ID":"8081e56b-635e-4137-982b-c5eafd77af8e","Type":"ContainerStarted","Data":"e4f86add35e5bc48999071e9bc112e0fe628d881aa94f7faaaa5cef732fc3295"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.531307 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b204f01-5a99-40b4-9cbf-38711f1d1f82","Type":"ContainerStarted","Data":"feb6c774a8ecd29253de604b3c9516871180dc2ea967fdbf62a5ba76a83de54b"} Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.531507 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.570884 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.774688149 podStartE2EDuration="26.570866922s" podCreationTimestamp="2026-03-12 15:08:34 +0000 UTC" firstStartedPulling="2026-03-12 15:08:35.667411439 +0000 UTC m=+1267.952636717" lastFinishedPulling="2026-03-12 15:08:57.463590212 +0000 UTC m=+1289.748815490" observedRunningTime="2026-03-12 15:09:00.528274193 +0000 UTC m=+1292.813499481" watchObservedRunningTime="2026-03-12 15:09:00.570866922 +0000 UTC m=+1292.856092200" Mar 12 15:09:00 crc kubenswrapper[4869]: I0312 15:09:00.579323 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" podStartSLOduration=12.579307553 podStartE2EDuration="12.579307553s" podCreationTimestamp="2026-03-12 15:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:00.548918844 +0000 UTC m=+1292.834144152" watchObservedRunningTime="2026-03-12 15:09:00.579307553 +0000 UTC m=+1292.864532831" Mar 12 15:09:00 crc kubenswrapper[4869]: W0312 15:09:00.582477 4869 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice/crio-23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7 WatchSource:0}: Error finding container 23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7: Status 404 returned error can't find the container with id 23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.552628 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerStarted","Data":"cd3b0393b44f190417d696075b113b7b1391d50f46566b5dd5e1adff25caf78c"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.558719 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9","Type":"ContainerStarted","Data":"1024636ffd65b6773aaf79a479050440cd4b9c5201e56d0c9d3de77a7fe57bbc"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.560150 4869 generic.go:334] "Generic (PLEG): container finished" podID="ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2" containerID="b874d3c15299068fb51df91aa3c1b783a965674412046dea094021820c9f052c" exitCode=0
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.560207 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mb78b" event={"ID":"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2","Type":"ContainerDied","Data":"b874d3c15299068fb51df91aa3c1b783a965674412046dea094021820c9f052c"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.566938 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5e9514e1-cc53-4445-904c-4505fc60a1ea","Type":"ContainerStarted","Data":"bb647e871607301d071226b3a30170442ae891103c7a693f204539af0acbfd22"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.572461 4869 generic.go:334] "Generic (PLEG): container finished" podID="e396abd5-7a42-405e-827f-85f8426c6ed6" containerID="7ba235e0554dcf21230c00522e982cb5615025c5105428b490d0d2c068ffc132" exitCode=0
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.572522 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-96j6d" event={"ID":"e396abd5-7a42-405e-827f-85f8426c6ed6","Type":"ContainerDied","Data":"7ba235e0554dcf21230c00522e982cb5615025c5105428b490d0d2c068ffc132"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.582021 4869 generic.go:334] "Generic (PLEG): container finished" podID="3fa83a8e-fd16-444a-8967-17725d75565d" containerID="3c98d32f355dd68e7daf78c6478c6758363c3cb4ff1388fc0fbf56edd988d467" exitCode=0
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.582094 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3fa83a8e-fd16-444a-8967-17725d75565d","Type":"ContainerDied","Data":"3c98d32f355dd68e7daf78c6478c6758363c3cb4ff1388fc0fbf56edd988d467"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.583509 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" event={"ID":"8081e56b-635e-4137-982b-c5eafd77af8e","Type":"ContainerStarted","Data":"16eb13845162e28d419a27553235b31320bb1c2ded11d9bbf92324524bce06d0"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.619216 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" event={"ID":"ec9fe62a-a7c1-4c9b-8520-2046bc10995a","Type":"ContainerStarted","Data":"bb8efafcd12a55d0d332b344fd4a1e8bd28d6b8f8fea0f88349c25e3572ae200"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.621346 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bcd88f7f5-258jj" event={"ID":"944bd0f4-e7ea-430d-995e-fabdf1f79bab","Type":"ContainerStarted","Data":"2f9f4a5d20b2f3f52d3412ac04848e49815fbab7a580729f63127f0cf8d52acd"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.629526 4869 generic.go:334] "Generic (PLEG): container finished" podID="42cdbca9-8e71-4fd8-8389-d4476417217b" containerID="025491aefa24c237c6480179bf7cd3a53bf1e21bea7a7e81c4244e42724177a2" exitCode=0
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.629613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pzfq5" event={"ID":"42cdbca9-8e71-4fd8-8389-d4476417217b","Type":"ContainerDied","Data":"025491aefa24c237c6480179bf7cd3a53bf1e21bea7a7e81c4244e42724177a2"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.635778 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c13a91ca-db8c-42b4-bfca-029e427aff28","Type":"ContainerStarted","Data":"23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.648310 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerID="56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22" exitCode=0
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.648622 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-snqth" event={"ID":"7d8820b4-254f-4f89-8609-a8b86b0d5796","Type":"ContainerDied","Data":"56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.648660 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-snqth" event={"ID":"7d8820b4-254f-4f89-8609-a8b86b0d5796","Type":"ContainerStarted","Data":"e85cf88af6a7a070fef5545697d2390f90c3fa7993c40a268a1ecb0601f8e04d"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.650278 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" podStartSLOduration=14.650258267 podStartE2EDuration="14.650258267s" podCreationTimestamp="2026-03-12 15:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:01.644274126 +0000 UTC m=+1293.929499404" watchObservedRunningTime="2026-03-12 15:09:01.650258267 +0000 UTC m=+1293.935483545"
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.656608 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b204f01-5a99-40b4-9cbf-38711f1d1f82","Type":"ContainerStarted","Data":"50fdb66f8813bb921f74ea9f96ac6c094697df019585ba1e85312e85d2f7ab71"}
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.677067 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54df54bd-p64lp"
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.784232 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54df54bd-p64lp"
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.860308 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-666b564cfb-pk78f"]
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.860596 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-666b564cfb-pk78f" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-log" containerID="cri-o://cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006" gracePeriod=30
Mar 12 15:09:01 crc kubenswrapper[4869]: I0312 15:09:01.861055 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-666b564cfb-pk78f" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-api" containerID="cri-o://c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271" gracePeriod=30
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.468755 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.473993 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1e62-account-create-update-8jrd8"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.578681 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-combined-ca-bundle\") pod \"3fa83a8e-fd16-444a-8967-17725d75565d\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.578758 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data\") pod \"3fa83a8e-fd16-444a-8967-17725d75565d\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.578827 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-799ln\" (UniqueName: \"kubernetes.io/projected/3fa83a8e-fd16-444a-8967-17725d75565d-kube-api-access-799ln\") pod \"3fa83a8e-fd16-444a-8967-17725d75565d\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.578877 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data-custom\") pod \"3fa83a8e-fd16-444a-8967-17725d75565d\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.578924 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-operator-scripts\") pod \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.578991 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-scripts\") pod \"3fa83a8e-fd16-444a-8967-17725d75565d\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.579031 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtdgf\" (UniqueName: \"kubernetes.io/projected/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-kube-api-access-wtdgf\") pod \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\" (UID: \"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.579116 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa83a8e-fd16-444a-8967-17725d75565d-etc-machine-id\") pod \"3fa83a8e-fd16-444a-8967-17725d75565d\" (UID: \"3fa83a8e-fd16-444a-8967-17725d75565d\") "
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.579675 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fa83a8e-fd16-444a-8967-17725d75565d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3fa83a8e-fd16-444a-8967-17725d75565d" (UID: "3fa83a8e-fd16-444a-8967-17725d75565d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.580785 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7" (UID: "2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.584062 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3fa83a8e-fd16-444a-8967-17725d75565d" (UID: "3fa83a8e-fd16-444a-8967-17725d75565d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.584845 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa83a8e-fd16-444a-8967-17725d75565d-kube-api-access-799ln" (OuterVolumeSpecName: "kube-api-access-799ln") pod "3fa83a8e-fd16-444a-8967-17725d75565d" (UID: "3fa83a8e-fd16-444a-8967-17725d75565d"). InnerVolumeSpecName "kube-api-access-799ln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.585125 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-scripts" (OuterVolumeSpecName: "scripts") pod "3fa83a8e-fd16-444a-8967-17725d75565d" (UID: "3fa83a8e-fd16-444a-8967-17725d75565d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.586755 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-kube-api-access-wtdgf" (OuterVolumeSpecName: "kube-api-access-wtdgf") pod "2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7" (UID: "2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7"). InnerVolumeSpecName "kube-api-access-wtdgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.656429 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.663834 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fa83a8e-fd16-444a-8967-17725d75565d" (UID: "3fa83a8e-fd16-444a-8967-17725d75565d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.676317 4869 generic.go:334] "Generic (PLEG): container finished" podID="8081e56b-635e-4137-982b-c5eafd77af8e" containerID="16eb13845162e28d419a27553235b31320bb1c2ded11d9bbf92324524bce06d0" exitCode=0
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.676428 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" event={"ID":"8081e56b-635e-4137-982b-c5eafd77af8e","Type":"ContainerDied","Data":"16eb13845162e28d419a27553235b31320bb1c2ded11d9bbf92324524bce06d0"}
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.681844 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-799ln\" (UniqueName: \"kubernetes.io/projected/3fa83a8e-fd16-444a-8967-17725d75565d-kube-api-access-799ln\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.682085 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.682173 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.682245 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.682357 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtdgf\" (UniqueName: \"kubernetes.io/projected/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7-kube-api-access-wtdgf\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.682427 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3fa83a8e-fd16-444a-8967-17725d75565d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.682499 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.685377 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b204f01-5a99-40b4-9cbf-38711f1d1f82","Type":"ContainerStarted","Data":"b23f2eefd51489d4e7567ecf494aceea183db4dd632b006f2adb56606b9acf62"}
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.685557 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api-log" containerID="cri-o://50fdb66f8813bb921f74ea9f96ac6c094697df019585ba1e85312e85d2f7ab71" gracePeriod=30
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.685630 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.685665 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api" containerID="cri-o://b23f2eefd51489d4e7567ecf494aceea183db4dd632b006f2adb56606b9acf62" gracePeriod=30
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.697754 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1e62-account-create-update-8jrd8" event={"ID":"2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7","Type":"ContainerDied","Data":"a536d0c13b45533668d79120ea209a7131da6ecbc32a9831848df29380c648a2"}
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.697800 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a536d0c13b45533668d79120ea209a7131da6ecbc32a9831848df29380c648a2"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.697877 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1e62-account-create-update-8jrd8"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.706346 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bcd88f7f5-258jj" event={"ID":"944bd0f4-e7ea-430d-995e-fabdf1f79bab","Type":"ContainerStarted","Data":"a2cc96be42d9cb8b3fe81a3de863af7750ec4e195ab2f3c720722c5a5bf95b03"}
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.752558 4869 generic.go:334] "Generic (PLEG): container finished" podID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerID="cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006" exitCode=143
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.752617 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-666b564cfb-pk78f" event={"ID":"340fff5b-1313-4cbd-82fc-fee5941c1aef","Type":"ContainerDied","Data":"cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006"}
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.755062 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.755269 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3fa83a8e-fd16-444a-8967-17725d75565d","Type":"ContainerDied","Data":"a4eacc1352efee4e3f0b177418ac7cc5acb4c13aadb0779ebb54e35c2a1aba2e"}
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.755299 4869 scope.go:117] "RemoveContainer" containerID="ac01121649e7151b6fccc1ceaef28cc60d4fc6a663ba1bdcfc03a76d5b3025cf"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.786202 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data" (OuterVolumeSpecName: "config-data") pod "3fa83a8e-fd16-444a-8967-17725d75565d" (UID: "3fa83a8e-fd16-444a-8967-17725d75565d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.796133 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5bcd88f7f5-258jj" podStartSLOduration=18.796115163 podStartE2EDuration="18.796115163s" podCreationTimestamp="2026-03-12 15:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:02.780933679 +0000 UTC m=+1295.066158957" watchObservedRunningTime="2026-03-12 15:09:02.796115163 +0000 UTC m=+1295.081340431"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.808054 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=12.808037173 podStartE2EDuration="12.808037173s" podCreationTimestamp="2026-03-12 15:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:02.798963873 +0000 UTC m=+1295.084189151" watchObservedRunningTime="2026-03-12 15:09:02.808037173 +0000 UTC m=+1295.093262451"
Mar 12 15:09:02 crc kubenswrapper[4869]: I0312 15:09:02.888599 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fa83a8e-fd16-444a-8967-17725d75565d-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.105848 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.146334 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.163718 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Mar 12 15:09:03 crc kubenswrapper[4869]: E0312 15:09:03.164228 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7" containerName="mariadb-account-create-update"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.164242 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7" containerName="mariadb-account-create-update"
Mar 12 15:09:03 crc kubenswrapper[4869]: E0312 15:09:03.164255 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="probe"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.164261 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="probe"
Mar 12 15:09:03 crc kubenswrapper[4869]: E0312 15:09:03.164283 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="manila-scheduler"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.164289 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="manila-scheduler"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.164502 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="manila-scheduler"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.164515 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" containerName="probe"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.164532 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7" containerName="mariadb-account-create-update"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.165704 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.171365 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.174365 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.189954 4869 scope.go:117] "RemoveContainer" containerID="3c98d32f355dd68e7daf78c6478c6758363c3cb4ff1388fc0fbf56edd988d467"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.319338 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.319733 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0b57477-4e74-4eb5-a0a0-c11e022f6919-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.319765 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-scripts\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.319798 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlm5q\" (UniqueName: \"kubernetes.io/projected/b0b57477-4e74-4eb5-a0a0-c11e022f6919-kube-api-access-xlm5q\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.319818 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-config-data\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.319855 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.421382 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.421519 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.421600 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0b57477-4e74-4eb5-a0a0-c11e022f6919-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.421627 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-scripts\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.421677 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlm5q\" (UniqueName: \"kubernetes.io/projected/b0b57477-4e74-4eb5-a0a0-c11e022f6919-kube-api-access-xlm5q\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.421696 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-config-data\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.424431 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0b57477-4e74-4eb5-a0a0-c11e022f6919-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.426311 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.430661 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-config-data\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.433096 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.437577 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0b57477-4e74-4eb5-a0a0-c11e022f6919-scripts\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.447076 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlm5q\" (UniqueName: \"kubernetes.io/projected/b0b57477-4e74-4eb5-a0a0-c11e022f6919-kube-api-access-xlm5q\") pod \"manila-scheduler-0\" (UID: \"b0b57477-4e74-4eb5-a0a0-c11e022f6919\") " pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.550114 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.765920 4869 generic.go:334] "Generic (PLEG): container finished" podID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerID="b23f2eefd51489d4e7567ecf494aceea183db4dd632b006f2adb56606b9acf62" exitCode=0
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.766229 4869 generic.go:334] "Generic (PLEG): container finished" podID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerID="50fdb66f8813bb921f74ea9f96ac6c094697df019585ba1e85312e85d2f7ab71" exitCode=143
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.766121 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b204f01-5a99-40b4-9cbf-38711f1d1f82","Type":"ContainerDied","Data":"b23f2eefd51489d4e7567ecf494aceea183db4dd632b006f2adb56606b9acf62"}
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.766293 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b204f01-5a99-40b4-9cbf-38711f1d1f82","Type":"ContainerDied","Data":"50fdb66f8813bb921f74ea9f96ac6c094697df019585ba1e85312e85d2f7ab71"}
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.768182 4869 generic.go:334] "Generic (PLEG): container finished" podID="ec9fe62a-a7c1-4c9b-8520-2046bc10995a" containerID="bb8efafcd12a55d0d332b344fd4a1e8bd28d6b8f8fea0f88349c25e3572ae200" exitCode=0
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.768231 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" event={"ID":"ec9fe62a-a7c1-4c9b-8520-2046bc10995a","Type":"ContainerDied","Data":"bb8efafcd12a55d0d332b344fd4a1e8bd28d6b8f8fea0f88349c25e3572ae200"}
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.770924 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bcd88f7f5-258jj"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.770958 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bcd88f7f5-258jj"
Mar 12 15:09:03 crc kubenswrapper[4869]: I0312 15:09:03.919414 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.066974 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mb78b"
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.082574 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-96j6d"
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.098252 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pzfq5"
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.134505 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-operator-scripts\") pod \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") "
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.134599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e396abd5-7a42-405e-827f-85f8426c6ed6-operator-scripts\") pod \"e396abd5-7a42-405e-827f-85f8426c6ed6\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") "
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.134734 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvklw\" (UniqueName: \"kubernetes.io/projected/e396abd5-7a42-405e-827f-85f8426c6ed6-kube-api-access-zvklw\") pod \"e396abd5-7a42-405e-827f-85f8426c6ed6\" (UID: \"e396abd5-7a42-405e-827f-85f8426c6ed6\") "
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.134800 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfhsr\" (UniqueName: \"kubernetes.io/projected/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-kube-api-access-wfhsr\") pod \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\" (UID: \"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2\") "
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.135694 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2" (UID: "ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.136131 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e396abd5-7a42-405e-827f-85f8426c6ed6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e396abd5-7a42-405e-827f-85f8426c6ed6" (UID: "e396abd5-7a42-405e-827f-85f8426c6ed6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.145420 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-kube-api-access-wfhsr" (OuterVolumeSpecName: "kube-api-access-wfhsr") pod "ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2" (UID: "ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2"). InnerVolumeSpecName "kube-api-access-wfhsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.145606 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e396abd5-7a42-405e-827f-85f8426c6ed6-kube-api-access-zvklw" (OuterVolumeSpecName: "kube-api-access-zvklw") pod "e396abd5-7a42-405e-827f-85f8426c6ed6" (UID: "e396abd5-7a42-405e-827f-85f8426c6ed6"). InnerVolumeSpecName "kube-api-access-zvklw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.236167 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cqws\" (UniqueName: \"kubernetes.io/projected/42cdbca9-8e71-4fd8-8389-d4476417217b-kube-api-access-5cqws\") pod \"42cdbca9-8e71-4fd8-8389-d4476417217b\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") "
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.236557 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42cdbca9-8e71-4fd8-8389-d4476417217b-operator-scripts\") pod \"42cdbca9-8e71-4fd8-8389-d4476417217b\" (UID: \"42cdbca9-8e71-4fd8-8389-d4476417217b\") "
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.237237 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.237256 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e396abd5-7a42-405e-827f-85f8426c6ed6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.237268 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvklw\" (UniqueName: \"kubernetes.io/projected/e396abd5-7a42-405e-827f-85f8426c6ed6-kube-api-access-zvklw\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.237279 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfhsr\" (UniqueName: \"kubernetes.io/projected/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2-kube-api-access-wfhsr\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.237839 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cdbca9-8e71-4fd8-8389-d4476417217b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42cdbca9-8e71-4fd8-8389-d4476417217b" (UID: "42cdbca9-8e71-4fd8-8389-d4476417217b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.244839 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cdbca9-8e71-4fd8-8389-d4476417217b-kube-api-access-5cqws" (OuterVolumeSpecName: "kube-api-access-5cqws") pod "42cdbca9-8e71-4fd8-8389-d4476417217b" (UID: "42cdbca9-8e71-4fd8-8389-d4476417217b"). InnerVolumeSpecName "kube-api-access-5cqws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.349887 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cqws\" (UniqueName: \"kubernetes.io/projected/42cdbca9-8e71-4fd8-8389-d4476417217b-kube-api-access-5cqws\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.349918 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42cdbca9-8e71-4fd8-8389-d4476417217b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.367145 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa83a8e-fd16-444a-8967-17725d75565d" path="/var/lib/kubelet/pods/3fa83a8e-fd16-444a-8967-17725d75565d/volumes"
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.655154 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e653-account-create-update-mjf5p"
Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.660846 4869 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.748352 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759302 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8081e56b-635e-4137-982b-c5eafd77af8e-operator-scripts\") pod \"8081e56b-635e-4137-982b-c5eafd77af8e\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759342 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-scripts\") pod \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759364 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpzf4\" (UniqueName: \"kubernetes.io/projected/8081e56b-635e-4137-982b-c5eafd77af8e-kube-api-access-xpzf4\") pod \"8081e56b-635e-4137-982b-c5eafd77af8e\" (UID: \"8081e56b-635e-4137-982b-c5eafd77af8e\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759397 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b204f01-5a99-40b4-9cbf-38711f1d1f82-logs\") pod \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759429 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data\") pod \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " Mar 12 15:09:04 crc 
kubenswrapper[4869]: I0312 15:09:04.759503 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data-custom\") pod \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759530 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b204f01-5a99-40b4-9cbf-38711f1d1f82-etc-machine-id\") pod \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759579 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csrbn\" (UniqueName: \"kubernetes.io/projected/9b204f01-5a99-40b4-9cbf-38711f1d1f82-kube-api-access-csrbn\") pod \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.759623 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-combined-ca-bundle\") pod \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\" (UID: \"9b204f01-5a99-40b4-9cbf-38711f1d1f82\") " Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.762170 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b204f01-5a99-40b4-9cbf-38711f1d1f82-logs" (OuterVolumeSpecName: "logs") pod "9b204f01-5a99-40b4-9cbf-38711f1d1f82" (UID: "9b204f01-5a99-40b4-9cbf-38711f1d1f82"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.762796 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8081e56b-635e-4137-982b-c5eafd77af8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8081e56b-635e-4137-982b-c5eafd77af8e" (UID: "8081e56b-635e-4137-982b-c5eafd77af8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.764757 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b204f01-5a99-40b4-9cbf-38711f1d1f82-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9b204f01-5a99-40b4-9cbf-38711f1d1f82" (UID: "9b204f01-5a99-40b4-9cbf-38711f1d1f82"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.784469 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b204f01-5a99-40b4-9cbf-38711f1d1f82" (UID: "9b204f01-5a99-40b4-9cbf-38711f1d1f82"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.788989 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b204f01-5a99-40b4-9cbf-38711f1d1f82-kube-api-access-csrbn" (OuterVolumeSpecName: "kube-api-access-csrbn") pod "9b204f01-5a99-40b4-9cbf-38711f1d1f82" (UID: "9b204f01-5a99-40b4-9cbf-38711f1d1f82"). InnerVolumeSpecName "kube-api-access-csrbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.798431 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-snqth" event={"ID":"7d8820b4-254f-4f89-8609-a8b86b0d5796","Type":"ContainerStarted","Data":"19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6"} Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.799692 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.800884 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-scripts" (OuterVolumeSpecName: "scripts") pod "9b204f01-5a99-40b4-9cbf-38711f1d1f82" (UID: "9b204f01-5a99-40b4-9cbf-38711f1d1f82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.804693 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8081e56b-635e-4137-982b-c5eafd77af8e-kube-api-access-xpzf4" (OuterVolumeSpecName: "kube-api-access-xpzf4") pod "8081e56b-635e-4137-982b-c5eafd77af8e" (UID: "8081e56b-635e-4137-982b-c5eafd77af8e"). InnerVolumeSpecName "kube-api-access-xpzf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.809310 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9b204f01-5a99-40b4-9cbf-38711f1d1f82","Type":"ContainerDied","Data":"feb6c774a8ecd29253de604b3c9516871180dc2ea967fdbf62a5ba76a83de54b"} Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.809368 4869 scope.go:117] "RemoveContainer" containerID="b23f2eefd51489d4e7567ecf494aceea183db4dd632b006f2adb56606b9acf62" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.809561 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.819260 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mb78b" event={"ID":"ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2","Type":"ContainerDied","Data":"4a124a617d00ca23ad987e9136a9cb1a047ad42157b1d3bf6c72b409e8b12ba8"} Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.819293 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a124a617d00ca23ad987e9136a9cb1a047ad42157b1d3bf6c72b409e8b12ba8" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.819345 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mb78b" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.826619 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-pzfq5" event={"ID":"42cdbca9-8e71-4fd8-8389-d4476417217b","Type":"ContainerDied","Data":"cf8c1ee1f8a2a6e97433d5bb58238564829e8a8d64fd74fd3ee51b61754b85e6"} Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.826652 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf8c1ee1f8a2a6e97433d5bb58238564829e8a8d64fd74fd3ee51b61754b85e6" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.826698 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-pzfq5" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.828730 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-96j6d" event={"ID":"e396abd5-7a42-405e-827f-85f8426c6ed6","Type":"ContainerDied","Data":"43c63e505faed973731922370114da3d39980aa27a6b9f07370e7acaf583a194"} Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.828745 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43c63e505faed973731922370114da3d39980aa27a6b9f07370e7acaf583a194" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.828780 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-96j6d" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.839585 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56696ff475-snqth" podStartSLOduration=14.839570203 podStartE2EDuration="14.839570203s" podCreationTimestamp="2026-03-12 15:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:04.836906017 +0000 UTC m=+1297.122131295" watchObservedRunningTime="2026-03-12 15:09:04.839570203 +0000 UTC m=+1297.124795481" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.845445 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.861842 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b204f01-5a99-40b4-9cbf-38711f1d1f82" (UID: "9b204f01-5a99-40b4-9cbf-38711f1d1f82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.866756 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e653-account-create-update-mjf5p" event={"ID":"8081e56b-635e-4137-982b-c5eafd77af8e","Type":"ContainerDied","Data":"e4f86add35e5bc48999071e9bc112e0fe628d881aa94f7faaaa5cef732fc3295"} Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.866809 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4f86add35e5bc48999071e9bc112e0fe628d881aa94f7faaaa5cef732fc3295" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880790 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880821 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpzf4\" (UniqueName: \"kubernetes.io/projected/8081e56b-635e-4137-982b-c5eafd77af8e-kube-api-access-xpzf4\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880830 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8081e56b-635e-4137-982b-c5eafd77af8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880838 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b204f01-5a99-40b4-9cbf-38711f1d1f82-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880846 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880854 4869 
reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b204f01-5a99-40b4-9cbf-38711f1d1f82-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880865 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csrbn\" (UniqueName: \"kubernetes.io/projected/9b204f01-5a99-40b4-9cbf-38711f1d1f82-kube-api-access-csrbn\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.880875 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.969039 4869 scope.go:117] "RemoveContainer" containerID="50fdb66f8813bb921f74ea9f96ac6c094697df019585ba1e85312e85d2f7ab71" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.984158 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data" (OuterVolumeSpecName: "config-data") pod "9b204f01-5a99-40b4-9cbf-38711f1d1f82" (UID: "9b204f01-5a99-40b4-9cbf-38711f1d1f82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:04 crc kubenswrapper[4869]: I0312 15:09:04.985328 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b204f01-5a99-40b4-9cbf-38711f1d1f82-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.058413 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.229302 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.295750 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.308449 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:09:05 crc kubenswrapper[4869]: E0312 15:09:05.309982 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.310006 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: E0312 15:09:05.310027 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e396abd5-7a42-405e-827f-85f8426c6ed6" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.310033 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e396abd5-7a42-405e-827f-85f8426c6ed6" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: E0312 15:09:05.310042 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api-log" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.310048 
4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api-log" Mar 12 15:09:05 crc kubenswrapper[4869]: E0312 15:09:05.310074 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8081e56b-635e-4137-982b-c5eafd77af8e" containerName="mariadb-account-create-update" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.310080 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8081e56b-635e-4137-982b-c5eafd77af8e" containerName="mariadb-account-create-update" Mar 12 15:09:05 crc kubenswrapper[4869]: E0312 15:09:05.310106 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.310112 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api" Mar 12 15:09:05 crc kubenswrapper[4869]: E0312 15:09:05.310140 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cdbca9-8e71-4fd8-8389-d4476417217b" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.310146 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cdbca9-8e71-4fd8-8389-d4476417217b" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.314753 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8081e56b-635e-4137-982b-c5eafd77af8e" containerName="mariadb-account-create-update" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.314792 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.314808 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e396abd5-7a42-405e-827f-85f8426c6ed6" containerName="mariadb-database-create" Mar 12 15:09:05 crc 
kubenswrapper[4869]: I0312 15:09:05.314824 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.314839 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" containerName="cinder-api-log" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.314852 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cdbca9-8e71-4fd8-8389-d4476417217b" containerName="mariadb-database-create" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.316307 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.320128 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.322168 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.328834 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.369635 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449159 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449215 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3d172b0b-59f1-408e-befd-28542b61af1b-logs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449262 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d172b0b-59f1-408e-befd-28542b61af1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449277 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449300 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-scripts\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449328 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449511 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449532 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-config-data\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.449572 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hgw8\" (UniqueName: \"kubernetes.io/projected/3d172b0b-59f1-408e-befd-28542b61af1b-kube-api-access-7hgw8\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.555740 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556075 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-config-data\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556098 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hgw8\" (UniqueName: \"kubernetes.io/projected/3d172b0b-59f1-408e-befd-28542b61af1b-kube-api-access-7hgw8\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556136 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556156 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d172b0b-59f1-408e-befd-28542b61af1b-logs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556187 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d172b0b-59f1-408e-befd-28542b61af1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556204 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556227 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-scripts\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556248 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.556875 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d172b0b-59f1-408e-befd-28542b61af1b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.557445 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d172b0b-59f1-408e-befd-28542b61af1b-logs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.563323 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.563453 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-scripts\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.564590 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.566376 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.572975 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.572992 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d172b0b-59f1-408e-befd-28542b61af1b-config-data\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.573967 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hgw8\" (UniqueName: \"kubernetes.io/projected/3d172b0b-59f1-408e-befd-28542b61af1b-kube-api-access-7hgw8\") pod \"cinder-api-0\" (UID: \"3d172b0b-59f1-408e-befd-28542b61af1b\") " pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.716842 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.724258 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.761312 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.861756 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-public-tls-certs\") pod \"340fff5b-1313-4cbd-82fc-fee5941c1aef\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.861855 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/340fff5b-1313-4cbd-82fc-fee5941c1aef-logs\") pod \"340fff5b-1313-4cbd-82fc-fee5941c1aef\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.861945 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-scripts\") pod \"340fff5b-1313-4cbd-82fc-fee5941c1aef\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.861983 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-internal-tls-certs\") pod \"340fff5b-1313-4cbd-82fc-fee5941c1aef\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.862011 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ldzv\" (UniqueName: \"kubernetes.io/projected/340fff5b-1313-4cbd-82fc-fee5941c1aef-kube-api-access-9ldzv\") pod \"340fff5b-1313-4cbd-82fc-fee5941c1aef\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.862103 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-combined-ca-bundle\") pod \"340fff5b-1313-4cbd-82fc-fee5941c1aef\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.862172 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-operator-scripts\") pod \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.862213 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjbnn\" (UniqueName: \"kubernetes.io/projected/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-kube-api-access-zjbnn\") pod \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\" (UID: \"ec9fe62a-a7c1-4c9b-8520-2046bc10995a\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.862260 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-config-data\") pod \"340fff5b-1313-4cbd-82fc-fee5941c1aef\" (UID: \"340fff5b-1313-4cbd-82fc-fee5941c1aef\") " Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.865511 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340fff5b-1313-4cbd-82fc-fee5941c1aef-logs" (OuterVolumeSpecName: "logs") pod "340fff5b-1313-4cbd-82fc-fee5941c1aef" (UID: "340fff5b-1313-4cbd-82fc-fee5941c1aef"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.870037 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340fff5b-1313-4cbd-82fc-fee5941c1aef-kube-api-access-9ldzv" (OuterVolumeSpecName: "kube-api-access-9ldzv") pod "340fff5b-1313-4cbd-82fc-fee5941c1aef" (UID: "340fff5b-1313-4cbd-82fc-fee5941c1aef"). InnerVolumeSpecName "kube-api-access-9ldzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.870191 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-kube-api-access-zjbnn" (OuterVolumeSpecName: "kube-api-access-zjbnn") pod "ec9fe62a-a7c1-4c9b-8520-2046bc10995a" (UID: "ec9fe62a-a7c1-4c9b-8520-2046bc10995a"). InnerVolumeSpecName "kube-api-access-zjbnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.870831 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec9fe62a-a7c1-4c9b-8520-2046bc10995a" (UID: "ec9fe62a-a7c1-4c9b-8520-2046bc10995a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.877118 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-scripts" (OuterVolumeSpecName: "scripts") pod "340fff5b-1313-4cbd-82fc-fee5941c1aef" (UID: "340fff5b-1313-4cbd-82fc-fee5941c1aef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.898173 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerStarted","Data":"abef6ce10f9b78d09aba11e8f0720335aa87354e3fd0046148a390edfefff7ec"} Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.910927 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5e9514e1-cc53-4445-904c-4505fc60a1ea","Type":"ContainerStarted","Data":"938b163b843f4e17857b4a072034c1018ac7a8eb550a4473aec0f820d93c1ca8"} Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.924100 4869 generic.go:334] "Generic (PLEG): container finished" podID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerID="c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271" exitCode=0 Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.924858 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-666b564cfb-pk78f" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.924970 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-666b564cfb-pk78f" event={"ID":"340fff5b-1313-4cbd-82fc-fee5941c1aef","Type":"ContainerDied","Data":"c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271"} Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.925042 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-666b564cfb-pk78f" event={"ID":"340fff5b-1313-4cbd-82fc-fee5941c1aef","Type":"ContainerDied","Data":"a3b98fcc48b220040e732cb5d31b573aa2ddaf500c1fcc624754b7d414654846"} Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.925071 4869 scope.go:117] "RemoveContainer" containerID="c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.929696 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b0b57477-4e74-4eb5-a0a0-c11e022f6919","Type":"ContainerStarted","Data":"64d985b658693200b6c809b374fe47785c56cea8b31739c22af4d7a2d556cc00"} Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.945750 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" event={"ID":"ec9fe62a-a7c1-4c9b-8520-2046bc10995a","Type":"ContainerDied","Data":"dbc4a01bbbfec1dc8adfacd8db467501d1a737302640868cbae03124eab2491a"} Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.945820 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbc4a01bbbfec1dc8adfacd8db467501d1a737302640868cbae03124eab2491a" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.945904 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-43b5-account-create-update-5ggll" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.958616 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.965698 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-config-data" (OuterVolumeSpecName: "config-data") pod "340fff5b-1313-4cbd-82fc-fee5941c1aef" (UID: "340fff5b-1313-4cbd-82fc-fee5941c1aef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.965695 4869 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.965791 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjbnn\" (UniqueName: \"kubernetes.io/projected/ec9fe62a-a7c1-4c9b-8520-2046bc10995a-kube-api-access-zjbnn\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.965808 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/340fff5b-1313-4cbd-82fc-fee5941c1aef-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.965844 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:05 crc kubenswrapper[4869]: I0312 15:09:05.965857 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ldzv\" (UniqueName: \"kubernetes.io/projected/340fff5b-1313-4cbd-82fc-fee5941c1aef-kube-api-access-9ldzv\") on 
node \"crc\" DevicePath \"\"" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.044023 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "340fff5b-1313-4cbd-82fc-fee5941c1aef" (UID: "340fff5b-1313-4cbd-82fc-fee5941c1aef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.061229 4869 scope.go:117] "RemoveContainer" containerID="cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.068138 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.068164 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.118826 4869 scope.go:117] "RemoveContainer" containerID="c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271" Mar 12 15:09:06 crc kubenswrapper[4869]: E0312 15:09:06.120440 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271\": container with ID starting with c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271 not found: ID does not exist" containerID="c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.120483 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271"} err="failed to get container status \"c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271\": rpc error: code = NotFound desc = could not find container \"c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271\": container with ID starting with c48102a109e8a9305cb4408ad78c9f91ca883cff4ad62cb7da26befaae38e271 not found: ID does not exist" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.120513 4869 scope.go:117] "RemoveContainer" containerID="cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006" Mar 12 15:09:06 crc kubenswrapper[4869]: E0312 15:09:06.121207 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006\": container with ID starting with cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006 not found: ID does not exist" containerID="cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.121244 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006"} err="failed to get container status \"cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006\": rpc error: code = NotFound desc = could not find container \"cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006\": container with ID starting with cbdd6143d6e8baeff3ee039b1708c0b35bbd32ab8aeef7031d49822ee148b006 not found: ID does not exist" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.275803 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod 
"340fff5b-1313-4cbd-82fc-fee5941c1aef" (UID: "340fff5b-1313-4cbd-82fc-fee5941c1aef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.275884 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "340fff5b-1313-4cbd-82fc-fee5941c1aef" (UID: "340fff5b-1313-4cbd-82fc-fee5941c1aef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.352818 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b204f01-5a99-40b4-9cbf-38711f1d1f82" path="/var/lib/kubelet/pods/9b204f01-5a99-40b4-9cbf-38711f1d1f82/volumes" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.375287 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.375314 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/340fff5b-1313-4cbd-82fc-fee5941c1aef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.508549 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.755099 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-666b564cfb-pk78f"] Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.781604 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-666b564cfb-pk78f"] Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.978905 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9","Type":"ContainerStarted","Data":"996fca8ead09df13972f40fe124cc62043a5f7eadae19ecfef41dbf98de20e18"} Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.995908 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d172b0b-59f1-408e-befd-28542b61af1b","Type":"ContainerStarted","Data":"7eba5f6c9b9f83ec8afcd4f4d5caedd7a2f0a6a63bdf99a237ca60d9d9c768b6"} Mar 12 15:09:06 crc kubenswrapper[4869]: I0312 15:09:06.998473 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5e9514e1-cc53-4445-904c-4505fc60a1ea","Type":"ContainerStarted","Data":"4c8079d6956ec3b6846f4db43100a55ba972a96b2ad8077d35664cb37bb0fdfb"} Mar 12 15:09:07 crc kubenswrapper[4869]: I0312 15:09:07.015745 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c13a91ca-db8c-42b4-bfca-029e427aff28","Type":"ContainerStarted","Data":"5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6"} Mar 12 15:09:07 crc kubenswrapper[4869]: I0312 15:09:07.028911 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b0b57477-4e74-4eb5-a0a0-c11e022f6919","Type":"ContainerStarted","Data":"3315beab31a76f518526482539460d38db22750c846cb3e5ce94fd38682f3c85"} Mar 12 15:09:07 crc kubenswrapper[4869]: I0312 15:09:07.034112 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=12.505079777 podStartE2EDuration="17.034089574s" podCreationTimestamp="2026-03-12 15:08:50 +0000 UTC" firstStartedPulling="2026-03-12 15:09:00.362256025 +0000 UTC m=+1292.647481303" lastFinishedPulling="2026-03-12 15:09:04.891265822 +0000 UTC m=+1297.176491100" observedRunningTime="2026-03-12 15:09:07.026071765 +0000 UTC m=+1299.311297043" watchObservedRunningTime="2026-03-12 15:09:07.034089574 
+0000 UTC m=+1299.319314852" Mar 12 15:09:07 crc kubenswrapper[4869]: I0312 15:09:07.039006 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bcd88f7f5-258jj" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.043429 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c13a91ca-db8c-42b4-bfca-029e427aff28","Type":"ContainerStarted","Data":"34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f"} Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.049121 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"b0b57477-4e74-4eb5-a0a0-c11e022f6919","Type":"ContainerStarted","Data":"b2163d8eb79547aade986751a8bd32996b6f56fe75cb0d2b4eee30e3aa7950b0"} Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.054874 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerStarted","Data":"ddb6144292d8b97dcf8e8f16ae0310e23b3126a3c32becdfb1f98d740e54ace6"} Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.057347 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9","Type":"ContainerStarted","Data":"d9d43cab5f4d12f8f0588da191b0634fe4c2c0bece9cd4edab160227813bf17d"} Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.084510 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=13.807615055 podStartE2EDuration="18.084493s" podCreationTimestamp="2026-03-12 15:08:50 +0000 UTC" firstStartedPulling="2026-03-12 15:09:00.585103569 +0000 UTC m=+1292.870328847" lastFinishedPulling="2026-03-12 15:09:04.861981514 +0000 UTC m=+1297.147206792" observedRunningTime="2026-03-12 15:09:08.080833265 +0000 UTC m=+1300.366058543" 
watchObservedRunningTime="2026-03-12 15:09:08.084493 +0000 UTC m=+1300.369718278" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.349743 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" path="/var/lib/kubelet/pods/340fff5b-1313-4cbd-82fc-fee5941c1aef/volumes" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.841457 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tgt7x"] Mar 12 15:09:08 crc kubenswrapper[4869]: E0312 15:09:08.842199 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9fe62a-a7c1-4c9b-8520-2046bc10995a" containerName="mariadb-account-create-update" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.842236 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9fe62a-a7c1-4c9b-8520-2046bc10995a" containerName="mariadb-account-create-update" Mar 12 15:09:08 crc kubenswrapper[4869]: E0312 15:09:08.842260 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-log" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.842267 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-log" Mar 12 15:09:08 crc kubenswrapper[4869]: E0312 15:09:08.842284 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-api" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.842290 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-api" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.842452 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-log" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.842465 4869 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9fe62a-a7c1-4c9b-8520-2046bc10995a" containerName="mariadb-account-create-update" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.842484 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="340fff5b-1313-4cbd-82fc-fee5941c1aef" containerName="placement-api" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.843084 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.845431 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.845804 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.849792 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tljpw" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.872922 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tgt7x"] Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.939007 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-config-data\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.939056 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-scripts\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " 
pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.939149 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:08 crc kubenswrapper[4869]: I0312 15:09:08.939224 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ssr\" (UniqueName: \"kubernetes.io/projected/7c21e651-1883-4a95-a803-e6a5a2b7f457-kube-api-access-r8ssr\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.040785 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-config-data\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.040826 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-scripts\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.040885 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: 
\"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.040958 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8ssr\" (UniqueName: \"kubernetes.io/projected/7c21e651-1883-4a95-a803-e6a5a2b7f457-kube-api-access-r8ssr\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.050514 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.088413 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d172b0b-59f1-408e-befd-28542b61af1b","Type":"ContainerStarted","Data":"cce4bcab06920d4235baa811074f9fbb1b931c3373985a3f1efff743276ddd33"} Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.117690 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=15.412755388 podStartE2EDuration="19.117669053s" podCreationTimestamp="2026-03-12 15:08:50 +0000 UTC" firstStartedPulling="2026-03-12 15:09:00.230695361 +0000 UTC m=+1292.515920639" lastFinishedPulling="2026-03-12 15:09:03.935609036 +0000 UTC m=+1296.220834304" observedRunningTime="2026-03-12 15:09:09.106960096 +0000 UTC m=+1301.392185384" watchObservedRunningTime="2026-03-12 15:09:09.117669053 +0000 UTC m=+1301.402894341" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.138948 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" 
podStartSLOduration=6.138926211 podStartE2EDuration="6.138926211s" podCreationTimestamp="2026-03-12 15:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:09.132275381 +0000 UTC m=+1301.417500669" watchObservedRunningTime="2026-03-12 15:09:09.138926211 +0000 UTC m=+1301.424151489" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.257917 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-scripts\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.258076 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-config-data\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.270329 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8ssr\" (UniqueName: \"kubernetes.io/projected/7c21e651-1883-4a95-a803-e6a5a2b7f457-kube-api-access-r8ssr\") pod \"nova-cell0-conductor-db-sync-tgt7x\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") " pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:09 crc kubenswrapper[4869]: I0312 15:09:09.467425 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tgt7x" Mar 12 15:09:10 crc kubenswrapper[4869]: I0312 15:09:10.036936 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tgt7x"] Mar 12 15:09:10 crc kubenswrapper[4869]: W0312 15:09:10.085931 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c21e651_1883_4a95_a803_e6a5a2b7f457.slice/crio-b1b97fd2a029702b53aea5f04e9529e1b914d9d6edd137d07b0b9c5d44f4b9de WatchSource:0}: Error finding container b1b97fd2a029702b53aea5f04e9529e1b914d9d6edd137d07b0b9c5d44f4b9de: Status 404 returned error can't find the container with id b1b97fd2a029702b53aea5f04e9529e1b914d9d6edd137d07b0b9c5d44f4b9de Mar 12 15:09:10 crc kubenswrapper[4869]: I0312 15:09:10.126360 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tgt7x" event={"ID":"7c21e651-1883-4a95-a803-e6a5a2b7f457","Type":"ContainerStarted","Data":"b1b97fd2a029702b53aea5f04e9529e1b914d9d6edd137d07b0b9c5d44f4b9de"} Mar 12 15:09:10 crc kubenswrapper[4869]: I0312 15:09:10.822673 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 15:09:10 crc kubenswrapper[4869]: I0312 15:09:10.977764 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:09:10 crc kubenswrapper[4869]: I0312 15:09:10.988222 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:10 crc kubenswrapper[4869]: I0312 15:09:10.989581 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-volume1-0" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.0.189:8080/\": dial tcp 10.217.0.189:8080: connect: connection refused" 
Mar 12 15:09:11 crc kubenswrapper[4869]: I0312 15:09:11.092201 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95ccfc9f9-9rcc7"] Mar 12 15:09:11 crc kubenswrapper[4869]: I0312 15:09:11.092517 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" podUID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerName="dnsmasq-dns" containerID="cri-o://b14c0e4c43c6113161339211452c8a28a565cbc34dd4a650599d4d8ab5e77c51" gracePeriod=10 Mar 12 15:09:11 crc kubenswrapper[4869]: I0312 15:09:11.105104 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 12 15:09:11 crc kubenswrapper[4869]: I0312 15:09:11.113056 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-backup-0" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.0.190:8080/\": dial tcp 10.217.0.190:8080: connect: connection refused" Mar 12 15:09:11 crc kubenswrapper[4869]: I0312 15:09:11.141753 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d172b0b-59f1-408e-befd-28542b61af1b","Type":"ContainerStarted","Data":"e24fb62d54db4a8bec0b9ba2fa964dbb53219dee4918ec0b9b47e6394e7fd826"} Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.151731 4869 generic.go:334] "Generic (PLEG): container finished" podID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerID="b14c0e4c43c6113161339211452c8a28a565cbc34dd4a650599d4d8ab5e77c51" exitCode=0 Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.153110 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" event={"ID":"3110530e-08c4-483d-898f-bcb2eeb0cc62","Type":"ContainerDied","Data":"b14c0e4c43c6113161339211452c8a28a565cbc34dd4a650599d4d8ab5e77c51"} Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.153368 4869 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.183465 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.183440565 podStartE2EDuration="7.183440565s" podCreationTimestamp="2026-03-12 15:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:12.172987976 +0000 UTC m=+1304.458213254" watchObservedRunningTime="2026-03-12 15:09:12.183440565 +0000 UTC m=+1304.468665843" Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.801436 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.933065 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-svc\") pod \"3110530e-08c4-483d-898f-bcb2eeb0cc62\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.933113 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-config\") pod \"3110530e-08c4-483d-898f-bcb2eeb0cc62\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.933209 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-nb\") pod \"3110530e-08c4-483d-898f-bcb2eeb0cc62\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.933311 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-swift-storage-0\") pod \"3110530e-08c4-483d-898f-bcb2eeb0cc62\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.933345 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44mbn\" (UniqueName: \"kubernetes.io/projected/3110530e-08c4-483d-898f-bcb2eeb0cc62-kube-api-access-44mbn\") pod \"3110530e-08c4-483d-898f-bcb2eeb0cc62\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.933367 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-sb\") pod \"3110530e-08c4-483d-898f-bcb2eeb0cc62\" (UID: \"3110530e-08c4-483d-898f-bcb2eeb0cc62\") " Mar 12 15:09:12 crc kubenswrapper[4869]: I0312 15:09:12.970291 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3110530e-08c4-483d-898f-bcb2eeb0cc62-kube-api-access-44mbn" (OuterVolumeSpecName: "kube-api-access-44mbn") pod "3110530e-08c4-483d-898f-bcb2eeb0cc62" (UID: "3110530e-08c4-483d-898f-bcb2eeb0cc62"). InnerVolumeSpecName "kube-api-access-44mbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.016597 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3110530e-08c4-483d-898f-bcb2eeb0cc62" (UID: "3110530e-08c4-483d-898f-bcb2eeb0cc62"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.019170 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-config" (OuterVolumeSpecName: "config") pod "3110530e-08c4-483d-898f-bcb2eeb0cc62" (UID: "3110530e-08c4-483d-898f-bcb2eeb0cc62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.020693 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3110530e-08c4-483d-898f-bcb2eeb0cc62" (UID: "3110530e-08c4-483d-898f-bcb2eeb0cc62"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.022731 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3110530e-08c4-483d-898f-bcb2eeb0cc62" (UID: "3110530e-08c4-483d-898f-bcb2eeb0cc62"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.035316 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.035354 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44mbn\" (UniqueName: \"kubernetes.io/projected/3110530e-08c4-483d-898f-bcb2eeb0cc62-kube-api-access-44mbn\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.035368 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.035379 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.035389 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.043138 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3110530e-08c4-483d-898f-bcb2eeb0cc62" (UID: "3110530e-08c4-483d-898f-bcb2eeb0cc62"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.136981 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3110530e-08c4-483d-898f-bcb2eeb0cc62-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.162814 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" event={"ID":"3110530e-08c4-483d-898f-bcb2eeb0cc62","Type":"ContainerDied","Data":"4a6f5fcd37bd9960009b050a58faeb823ec610a3e411721956e863845d147445"} Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.162853 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95ccfc9f9-9rcc7" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.162866 4869 scope.go:117] "RemoveContainer" containerID="b14c0e4c43c6113161339211452c8a28a565cbc34dd4a650599d4d8ab5e77c51" Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.165846 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerStarted","Data":"1292976ee9e3579ce89cfb4080110d00e5a5768b2e284f96fdc169afd47f4f15"} Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.216441 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95ccfc9f9-9rcc7"] Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.225309 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95ccfc9f9-9rcc7"] Mar 12 15:09:13 crc kubenswrapper[4869]: I0312 15:09:13.551038 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 12 15:09:14 crc kubenswrapper[4869]: I0312 15:09:14.353675 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3110530e-08c4-483d-898f-bcb2eeb0cc62" 
path="/var/lib/kubelet/pods/3110530e-08c4-483d-898f-bcb2eeb0cc62/volumes" Mar 12 15:09:15 crc kubenswrapper[4869]: I0312 15:09:15.824607 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.187:8080/\": dial tcp 10.217.0.187:8080: connect: connection refused" Mar 12 15:09:15 crc kubenswrapper[4869]: I0312 15:09:15.988290 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-volume1-0" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.0.189:8080/\": dial tcp 10.217.0.189:8080: connect: connection refused" Mar 12 15:09:16 crc kubenswrapper[4869]: I0312 15:09:16.104095 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-backup-0" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.0.190:8080/\": dial tcp 10.217.0.190:8080: connect: connection refused" Mar 12 15:09:19 crc kubenswrapper[4869]: I0312 15:09:19.556188 4869 scope.go:117] "RemoveContainer" containerID="192ea470b280bb582eae77c3f3779e33d7e7b61f275b0a207a15770c19a306ec" Mar 12 15:09:20 crc kubenswrapper[4869]: I0312 15:09:20.501451 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 12 15:09:20 crc kubenswrapper[4869]: I0312 15:09:20.549715 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:09:20 crc kubenswrapper[4869]: I0312 15:09:20.726724 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="3d172b0b-59f1-408e-befd-28542b61af1b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.194:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Mar 12 15:09:21 crc kubenswrapper[4869]: I0312 15:09:21.145681 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 15:09:21 crc kubenswrapper[4869]: I0312 15:09:21.287519 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="manila-share" containerID="cri-o://8024b6d2f80b1844ef74c5d435773a5be37299846584c993574bc63e515f0c67" gracePeriod=30 Mar 12 15:09:21 crc kubenswrapper[4869]: I0312 15:09:21.287917 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="probe" containerID="cri-o://7f81bd9fade21a35c31dd509835b1c12df5880f2683e340a53bc73ea3406d1ca" gracePeriod=30 Mar 12 15:09:21 crc kubenswrapper[4869]: I0312 15:09:21.339798 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-volume1-0" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="cinder-volume" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 15:09:21 crc kubenswrapper[4869]: I0312 15:09:21.441031 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 12 15:09:21 crc kubenswrapper[4869]: I0312 15:09:21.516929 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:09:22 crc kubenswrapper[4869]: I0312 15:09:22.316041 4869 generic.go:334] "Generic (PLEG): container finished" podID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerID="7f81bd9fade21a35c31dd509835b1c12df5880f2683e340a53bc73ea3406d1ca" exitCode=0 Mar 12 15:09:22 crc kubenswrapper[4869]: I0312 15:09:22.316676 4869 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="cinder-backup" containerID="cri-o://938b163b843f4e17857b4a072034c1018ac7a8eb550a4473aec0f820d93c1ca8" gracePeriod=30 Mar 12 15:09:22 crc kubenswrapper[4869]: I0312 15:09:22.316976 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6838a2d7-2052-45b9-a8d5-3aa6639bccb4","Type":"ContainerDied","Data":"7f81bd9fade21a35c31dd509835b1c12df5880f2683e340a53bc73ea3406d1ca"} Mar 12 15:09:22 crc kubenswrapper[4869]: I0312 15:09:22.317306 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="probe" containerID="cri-o://4c8079d6956ec3b6846f4db43100a55ba972a96b2ad8077d35664cb37bb0fdfb" gracePeriod=30 Mar 12 15:09:22 crc kubenswrapper[4869]: I0312 15:09:22.983185 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 12 15:09:23 crc kubenswrapper[4869]: I0312 15:09:23.335823 4869 generic.go:334] "Generic (PLEG): container finished" podID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerID="4c8079d6956ec3b6846f4db43100a55ba972a96b2ad8077d35664cb37bb0fdfb" exitCode=0 Mar 12 15:09:23 crc kubenswrapper[4869]: I0312 15:09:23.335854 4869 generic.go:334] "Generic (PLEG): container finished" podID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerID="938b163b843f4e17857b4a072034c1018ac7a8eb550a4473aec0f820d93c1ca8" exitCode=0 Mar 12 15:09:23 crc kubenswrapper[4869]: I0312 15:09:23.335895 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5e9514e1-cc53-4445-904c-4505fc60a1ea","Type":"ContainerDied","Data":"4c8079d6956ec3b6846f4db43100a55ba972a96b2ad8077d35664cb37bb0fdfb"} Mar 12 15:09:23 crc kubenswrapper[4869]: I0312 15:09:23.335951 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-backup-0" event={"ID":"5e9514e1-cc53-4445-904c-4505fc60a1ea","Type":"ContainerDied","Data":"938b163b843f4e17857b4a072034c1018ac7a8eb550a4473aec0f820d93c1ca8"} Mar 12 15:09:23 crc kubenswrapper[4869]: I0312 15:09:23.340932 4869 generic.go:334] "Generic (PLEG): container finished" podID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerID="8024b6d2f80b1844ef74c5d435773a5be37299846584c993574bc63e515f0c67" exitCode=1 Mar 12 15:09:23 crc kubenswrapper[4869]: I0312 15:09:23.341025 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6838a2d7-2052-45b9-a8d5-3aa6639bccb4","Type":"ContainerDied","Data":"8024b6d2f80b1844ef74c5d435773a5be37299846584c993574bc63e515f0c67"} Mar 12 15:09:25 crc kubenswrapper[4869]: I0312 15:09:25.646648 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 12 15:09:25 crc kubenswrapper[4869]: I0312 15:09:25.841308 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 12 15:09:25 crc kubenswrapper[4869]: I0312 15:09:25.901328 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:09:25 crc kubenswrapper[4869]: I0312 15:09:25.996788 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:26 crc kubenswrapper[4869]: I0312 15:09:26.060440 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:09:26 crc kubenswrapper[4869]: I0312 15:09:26.367607 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="cinder-volume" containerID="cri-o://5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6" gracePeriod=30 Mar 12 15:09:26 crc kubenswrapper[4869]: I0312 15:09:26.367630 4869 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="probe" containerID="cri-o://34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f" gracePeriod=30 Mar 12 15:09:26 crc kubenswrapper[4869]: I0312 15:09:26.367713 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="probe" containerID="cri-o://d9d43cab5f4d12f8f0588da191b0634fe4c2c0bece9cd4edab160227813bf17d" gracePeriod=30 Mar 12 15:09:26 crc kubenswrapper[4869]: I0312 15:09:26.367704 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="cinder-scheduler" containerID="cri-o://996fca8ead09df13972f40fe124cc62043a5f7eadae19ecfef41dbf98de20e18" gracePeriod=30 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.410295 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"df9ac527-ae76-4cb7-b474-61f5699e610f","Type":"ContainerStarted","Data":"918fd1148c12f181a6ee3411d3a138b3791889d5ff4272298e85f07e31aa251f"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.419923 4869 generic.go:334] "Generic (PLEG): container finished" podID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerID="d9d43cab5f4d12f8f0588da191b0634fe4c2c0bece9cd4edab160227813bf17d" exitCode=0 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.419966 4869 generic.go:334] "Generic (PLEG): container finished" podID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerID="996fca8ead09df13972f40fe124cc62043a5f7eadae19ecfef41dbf98de20e18" exitCode=0 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.420039 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9","Type":"ContainerDied","Data":"d9d43cab5f4d12f8f0588da191b0634fe4c2c0bece9cd4edab160227813bf17d"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.420071 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9","Type":"ContainerDied","Data":"996fca8ead09df13972f40fe124cc62043a5f7eadae19ecfef41dbf98de20e18"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.433670 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5e9514e1-cc53-4445-904c-4505fc60a1ea","Type":"ContainerDied","Data":"bb647e871607301d071226b3a30170442ae891103c7a693f204539af0acbfd22"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.433712 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb647e871607301d071226b3a30170442ae891103c7a693f204539af0acbfd22" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.443186 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.805978773 podStartE2EDuration="54.443171422s" podCreationTimestamp="2026-03-12 15:08:33 +0000 UTC" firstStartedPulling="2026-03-12 15:08:35.084109874 +0000 UTC m=+1267.369335152" lastFinishedPulling="2026-03-12 15:09:26.721302523 +0000 UTC m=+1319.006527801" observedRunningTime="2026-03-12 15:09:27.440921977 +0000 UTC m=+1319.726147255" watchObservedRunningTime="2026-03-12 15:09:27.443171422 +0000 UTC m=+1319.728396700" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.462479 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"6838a2d7-2052-45b9-a8d5-3aa6639bccb4","Type":"ContainerDied","Data":"5eef3bff8ede5e368449574dbd8de81c3649bcd7bf68cc7651c22688dcd5131d"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.462517 4869 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="5eef3bff8ede5e368449574dbd8de81c3649bcd7bf68cc7651c22688dcd5131d" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.470700 4869 generic.go:334] "Generic (PLEG): container finished" podID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerID="34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f" exitCode=0 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.470765 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c13a91ca-db8c-42b4-bfca-029e427aff28","Type":"ContainerDied","Data":"34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.472607 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tgt7x" event={"ID":"7c21e651-1883-4a95-a803-e6a5a2b7f457","Type":"ContainerStarted","Data":"fb8feff685efe1b60d51bd5cf1b516b74dd7980d841d58162ce0d145a168e576"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.481940 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerStarted","Data":"df9ada5ff4cc69b25fabf28f675a3451db0d54dce2b7dce3f095cf1db1450059"} Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.482155 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-central-agent" containerID="cri-o://abef6ce10f9b78d09aba11e8f0720335aa87354e3fd0046148a390edfefff7ec" gracePeriod=30 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.482777 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.482837 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="proxy-httpd" containerID="cri-o://df9ada5ff4cc69b25fabf28f675a3451db0d54dce2b7dce3f095cf1db1450059" gracePeriod=30 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.482894 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="sg-core" containerID="cri-o://1292976ee9e3579ce89cfb4080110d00e5a5768b2e284f96fdc169afd47f4f15" gracePeriod=30 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.482942 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-notification-agent" containerID="cri-o://ddb6144292d8b97dcf8e8f16ae0310e23b3126a3c32becdfb1f98d740e54ace6" gracePeriod=30 Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.499926 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tgt7x" podStartSLOduration=2.774724909 podStartE2EDuration="19.499904904s" podCreationTimestamp="2026-03-12 15:09:08 +0000 UTC" firstStartedPulling="2026-03-12 15:09:10.103245283 +0000 UTC m=+1302.388470561" lastFinishedPulling="2026-03-12 15:09:26.828425278 +0000 UTC m=+1319.113650556" observedRunningTime="2026-03-12 15:09:27.49309082 +0000 UTC m=+1319.778316088" watchObservedRunningTime="2026-03-12 15:09:27.499904904 +0000 UTC m=+1319.785130182" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.519428 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.889463609 podStartE2EDuration="28.519409302s" podCreationTimestamp="2026-03-12 15:08:59 +0000 UTC" firstStartedPulling="2026-03-12 15:09:00.583254266 +0000 UTC m=+1292.868479544" lastFinishedPulling="2026-03-12 15:09:26.213199969 +0000 UTC m=+1318.498425237" observedRunningTime="2026-03-12 
15:09:27.517078936 +0000 UTC m=+1319.802304224" watchObservedRunningTime="2026-03-12 15:09:27.519409302 +0000 UTC m=+1319.804634580" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.548534 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.565468 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664388 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-scripts\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664479 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhbtk\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-kube-api-access-mhbtk\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664552 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-lib-cinder\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664585 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-brick\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664601 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data-custom\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664625 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-nvme\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664658 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664689 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-iscsi\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664705 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-machine-id\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664740 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-dev\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: 
\"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664775 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-sys\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664811 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-combined-ca-bundle\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664841 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-cinder\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664862 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-ceph\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664917 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-run\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.664948 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-lib-modules\") pod \"5e9514e1-cc53-4445-904c-4505fc60a1ea\" (UID: \"5e9514e1-cc53-4445-904c-4505fc60a1ea\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.665145 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.665450 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.665593 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.665619 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-dev" (OuterVolumeSpecName: "dev") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.665635 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-sys" (OuterVolumeSpecName: "sys") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.666443 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.668028 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-run" (OuterVolumeSpecName: "run") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.668084 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.673399 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.674650 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.674682 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.674769 4869 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.674829 4869 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.676666 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-scripts" (OuterVolumeSpecName: "scripts") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.679487 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-ceph" (OuterVolumeSpecName: "ceph") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.681887 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-kube-api-access-mhbtk" (OuterVolumeSpecName: "kube-api-access-mhbtk") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "kube-api-access-mhbtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.777985 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-var-lib-manila\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778035 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-scripts\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778060 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-combined-ca-bundle\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778082 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data-custom\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778101 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-ceph\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778146 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778210 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-etc-machine-id\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778281 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqc8q\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-kube-api-access-kqc8q\") pod \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\" (UID: \"6838a2d7-2052-45b9-a8d5-3aa6639bccb4\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778665 4869 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-run\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778678 4869 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778687 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778697 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhbtk\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-kube-api-access-mhbtk\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778706 
4869 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778733 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778741 4869 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778749 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778757 4869 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-dev\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778766 4869 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-sys\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778774 4869 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e9514e1-cc53-4445-904c-4505fc60a1ea-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.778781 4869 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9514e1-cc53-4445-904c-4505fc60a1ea-ceph\") on node \"crc\" 
DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.782095 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.785059 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.785450 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.785709 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-kube-api-access-kqc8q" (OuterVolumeSpecName: "kube-api-access-kqc8q") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "kube-api-access-kqc8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.786615 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.786868 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-scripts" (OuterVolumeSpecName: "scripts") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.822430 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.822436 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-ceph" (OuterVolumeSpecName: "ceph") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.885723 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.886052 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqc8q\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-kube-api-access-kqc8q\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.886063 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.886077 4869 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-var-lib-manila\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.886086 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.886112 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.886123 4869 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.897839 4869 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.917503 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data" (OuterVolumeSpecName: "config-data") pod "5e9514e1-cc53-4445-904c-4505fc60a1ea" (UID: "5e9514e1-cc53-4445-904c-4505fc60a1ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.923922 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.939033 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data" (OuterVolumeSpecName: "config-data") pod "6838a2d7-2052-45b9-a8d5-3aa6639bccb4" (UID: "6838a2d7-2052-45b9-a8d5-3aa6639bccb4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.987336 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-etc-machine-id\") pod \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.987387 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data\") pod \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.987420 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whx8k\" (UniqueName: \"kubernetes.io/projected/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-kube-api-access-whx8k\") pod \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.987524 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data-custom\") pod \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.987632 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-scripts\") pod \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.987718 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-combined-ca-bundle\") pod \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\" (UID: \"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9\") " Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.987830 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" (UID: "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.988067 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9514e1-cc53-4445-904c-4505fc60a1ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.988079 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.988090 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.988099 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6838a2d7-2052-45b9-a8d5-3aa6639bccb4-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.991662 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-kube-api-access-whx8k" (OuterVolumeSpecName: "kube-api-access-whx8k") pod "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" (UID: 
"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9"). InnerVolumeSpecName "kube-api-access-whx8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.991873 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" (UID: "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:27 crc kubenswrapper[4869]: I0312 15:09:27.994887 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-scripts" (OuterVolumeSpecName: "scripts") pod "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" (UID: "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.056568 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" (UID: "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089548 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28v8v\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-kube-api-access-28v8v\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089608 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-cinder\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089645 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089664 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-machine-id\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089680 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-lib-modules\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089713 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-dev\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089752 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-brick\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089770 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-run\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089790 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-lib-cinder\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089812 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-sys\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089874 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-ceph\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089916 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-combined-ca-bundle\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089939 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-scripts\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.089970 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data-custom\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090004 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-nvme\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090039 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-iscsi\") pod \"c13a91ca-db8c-42b4-bfca-029e427aff28\" (UID: \"c13a91ca-db8c-42b4-bfca-029e427aff28\") " Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090386 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090404 4869 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090417 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whx8k\" (UniqueName: \"kubernetes.io/projected/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-kube-api-access-whx8k\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090428 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090479 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090836 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-run" (OuterVolumeSpecName: "run") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090901 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.090941 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-sys" (OuterVolumeSpecName: "sys") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.091054 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.091127 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.091170 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.091174 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.091206 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-dev" (OuterVolumeSpecName: "dev") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.091237 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.094039 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-kube-api-access-28v8v" (OuterVolumeSpecName: "kube-api-access-28v8v") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "kube-api-access-28v8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.094711 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.099002 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data" (OuterVolumeSpecName: "config-data") pod "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" (UID: "f2897896-f1c6-4ec2-9d89-fc19b2bf18a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.099934 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-ceph" (OuterVolumeSpecName: "ceph") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.102151 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-scripts" (OuterVolumeSpecName: "scripts") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.145561 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192709 4869 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192750 4869 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192762 4869 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-lib-modules\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192773 4869 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-dev\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192785 4869 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-locks-brick\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192795 4869 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-run\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192806 4869 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192817 4869 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-sys\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192827 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192837 4869 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192849 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192859 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192869 4869 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192879 4869 
reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-nvme\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192890 4869 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c13a91ca-db8c-42b4-bfca-029e427aff28-etc-iscsi\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.192928 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28v8v\" (UniqueName: \"kubernetes.io/projected/c13a91ca-db8c-42b4-bfca-029e427aff28-kube-api-access-28v8v\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.194472 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data" (OuterVolumeSpecName: "config-data") pod "c13a91ca-db8c-42b4-bfca-029e427aff28" (UID: "c13a91ca-db8c-42b4-bfca-029e427aff28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.302197 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c13a91ca-db8c-42b4-bfca-029e427aff28-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.492281 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2897896-f1c6-4ec2-9d89-fc19b2bf18a9","Type":"ContainerDied","Data":"1024636ffd65b6773aaf79a479050440cd4b9c5201e56d0c9d3de77a7fe57bbc"} Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.492678 4869 scope.go:117] "RemoveContainer" containerID="d9d43cab5f4d12f8f0588da191b0634fe4c2c0bece9cd4edab160227813bf17d" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.492866 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.500845 4869 generic.go:334] "Generic (PLEG): container finished" podID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerID="5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6" exitCode=0 Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.500915 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c13a91ca-db8c-42b4-bfca-029e427aff28","Type":"ContainerDied","Data":"5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6"} Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.500959 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c13a91ca-db8c-42b4-bfca-029e427aff28","Type":"ContainerDied","Data":"23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7"} Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.501017 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.513741 4869 generic.go:334] "Generic (PLEG): container finished" podID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerID="1292976ee9e3579ce89cfb4080110d00e5a5768b2e284f96fdc169afd47f4f15" exitCode=2 Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.514035 4869 generic.go:334] "Generic (PLEG): container finished" podID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerID="ddb6144292d8b97dcf8e8f16ae0310e23b3126a3c32becdfb1f98d740e54ace6" exitCode=0 Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.514163 4869 generic.go:334] "Generic (PLEG): container finished" podID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerID="abef6ce10f9b78d09aba11e8f0720335aa87354e3fd0046148a390edfefff7ec" exitCode=0 Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.515290 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.516039 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerDied","Data":"1292976ee9e3579ce89cfb4080110d00e5a5768b2e284f96fdc169afd47f4f15"} Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.516083 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerDied","Data":"ddb6144292d8b97dcf8e8f16ae0310e23b3126a3c32becdfb1f98d740e54ace6"} Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.516109 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerDied","Data":"abef6ce10f9b78d09aba11e8f0720335aa87354e3fd0046148a390edfefff7ec"} Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.517081 4869 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.532995 4869 scope.go:117] "RemoveContainer" containerID="996fca8ead09df13972f40fe124cc62043a5f7eadae19ecfef41dbf98de20e18" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.534528 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.550826 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.566961 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567349 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="cinder-scheduler" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567363 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="cinder-scheduler" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567379 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="cinder-backup" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567386 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="cinder-backup" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567405 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="manila-share" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567410 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="manila-share" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567428 4869 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerName="init" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567434 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerName="init" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567446 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerName="dnsmasq-dns" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567451 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerName="dnsmasq-dns" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567462 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567468 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567480 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="cinder-volume" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567485 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="cinder-volume" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567496 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567502 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567511 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" 
containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567518 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.567530 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567549 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567724 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567743 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567752 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="cinder-scheduler" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567762 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567772 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3110530e-08c4-483d-898f-bcb2eeb0cc62" containerName="dnsmasq-dns" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567784 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" containerName="manila-share" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567790 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" containerName="cinder-volume" Mar 12 15:09:28 crc 
kubenswrapper[4869]: I0312 15:09:28.567798 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="probe" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.567809 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" containerName="cinder-backup" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.568706 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.572726 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.593623 4869 scope.go:117] "RemoveContainer" containerID="34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.593726 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.608893 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95fcb56b-b075-469f-9a02-bcc737a18c26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.608952 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-scripts\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.608990 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-config-data\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.609029 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.609053 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.609316 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdqb\" (UniqueName: \"kubernetes.io/projected/95fcb56b-b075-469f-9a02-bcc737a18c26-kube-api-access-vgdqb\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.642765 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.658445 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.667849 4869 scope.go:117] "RemoveContainer" containerID="5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.679628 4869 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice/crio-23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7\": RecentStats: unable to find data in memory cache]" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.682836 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.703604 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.710802 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdqb\" (UniqueName: \"kubernetes.io/projected/95fcb56b-b075-469f-9a02-bcc737a18c26-kube-api-access-vgdqb\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.710875 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95fcb56b-b075-469f-9a02-bcc737a18c26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.710909 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-scripts\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.710946 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-config-data\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.710979 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.711005 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.714969 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.715028 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95fcb56b-b075-469f-9a02-bcc737a18c26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.720074 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-scripts\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.720090 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.722391 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.724596 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95fcb56b-b075-469f-9a02-bcc737a18c26-config-data\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.727809 4869 scope.go:117] "RemoveContainer" containerID="34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.728665 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f\": container with ID starting with 34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f not found: ID does not exist" containerID="34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.728704 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f"} err="failed to get container status \"34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f\": rpc error: code = NotFound desc = could not find container 
\"34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f\": container with ID starting with 34d1276e9015be95facca31f80054df14d295f9bf53676ed3de357b8cc588f0f not found: ID does not exist" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.728822 4869 scope.go:117] "RemoveContainer" containerID="5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6" Mar 12 15:09:28 crc kubenswrapper[4869]: E0312 15:09:28.729374 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6\": container with ID starting with 5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6 not found: ID does not exist" containerID="5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.729426 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6"} err="failed to get container status \"5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6\": rpc error: code = NotFound desc = could not find container \"5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6\": container with ID starting with 5c0e9b92a4fc76dda029c064be59efd24949d843bc7c3041ec8b02343a1ee8f6 not found: ID does not exist" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.732854 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdqb\" (UniqueName: \"kubernetes.io/projected/95fcb56b-b075-469f-9a02-bcc737a18c26-kube-api-access-vgdqb\") pod \"cinder-scheduler-0\" (UID: \"95fcb56b-b075-469f-9a02-bcc737a18c26\") " pod="openstack/cinder-scheduler-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.758989 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:09:28 crc 
kubenswrapper[4869]: I0312 15:09:28.760966 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.764126 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.769843 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.778099 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.781023 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.800851 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.809818 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.847752 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.879414 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.882006 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.889414 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.899935 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917735 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9523917e-a416-4304-9479-9ef0a1a9a09d-ceph\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917808 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9523917e-a416-4304-9479-9ef0a1a9a09d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917833 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917883 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/870974b7-5400-44ae-91be-ca7bc532764c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917935 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917956 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917975 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.917996 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhw7\" (UniqueName: \"kubernetes.io/projected/870974b7-5400-44ae-91be-ca7bc532764c-kube-api-access-fnhw7\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918028 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918055 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918082 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918108 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9v28\" (UniqueName: \"kubernetes.io/projected/9523917e-a416-4304-9479-9ef0a1a9a09d-kube-api-access-v9v28\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918130 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918189 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918220 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-config-data\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918241 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9523917e-a416-4304-9479-9ef0a1a9a09d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918276 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918295 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-scripts\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918315 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918337 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-run\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918365 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918383 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918403 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.918421 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:28 crc kubenswrapper[4869]: I0312 15:09:28.930594 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025087 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025135 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-config-data\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025155 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9523917e-a416-4304-9479-9ef0a1a9a09d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025185 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025202 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-scripts\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025218 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025238 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025253 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-run\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025274 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025289 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-run\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025322 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025336 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025351 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025364 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-nvme\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025380 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025395 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025411 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-sys\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025425 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-config-data\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025453 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9523917e-a416-4304-9479-9ef0a1a9a09d-ceph\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025471 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9523917e-a416-4304-9479-9ef0a1a9a09d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025494 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025510 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/449d619a-d94f-45bc-9066-919a85d84f76-ceph\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025534 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlzn2\" (UniqueName: \"kubernetes.io/projected/449d619a-d94f-45bc-9066-919a85d84f76-kube-api-access-zlzn2\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025576 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/870974b7-5400-44ae-91be-ca7bc532764c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025595 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-lib-modules\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025609 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025628 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-dev\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025654 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025680 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025701 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhw7\" (UniqueName: \"kubernetes.io/projected/870974b7-5400-44ae-91be-ca7bc532764c-kube-api-access-fnhw7\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025721 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025741 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-config-data\") pod \"cinder-volume-volume1-0\" (UID: 
\"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025767 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-config-data-custom\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025787 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025807 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025830 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025850 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9v28\" (UniqueName: \"kubernetes.io/projected/9523917e-a416-4304-9479-9ef0a1a9a09d-kube-api-access-v9v28\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " 
pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025866 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025882 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-scripts\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.025907 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.026020 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.028257 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.028301 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/9523917e-a416-4304-9479-9ef0a1a9a09d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.028332 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.028577 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-run\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.028758 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.034086 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9523917e-a416-4304-9479-9ef0a1a9a09d-ceph\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.034165 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9523917e-a416-4304-9479-9ef0a1a9a09d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " 
pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.028260 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.034241 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.034266 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.035694 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.035745 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/870974b7-5400-44ae-91be-ca7bc532764c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.037425 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.037599 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-scripts\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.039124 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.045829 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.046756 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.046996 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-config-data\") pod \"manila-share-share1-0\" (UID: 
\"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.047335 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/870974b7-5400-44ae-91be-ca7bc532764c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.065636 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9523917e-a416-4304-9479-9ef0a1a9a09d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.072785 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870974b7-5400-44ae-91be-ca7bc532764c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.086594 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhw7\" (UniqueName: \"kubernetes.io/projected/870974b7-5400-44ae-91be-ca7bc532764c-kube-api-access-fnhw7\") pod \"cinder-volume-volume1-0\" (UID: \"870974b7-5400-44ae-91be-ca7bc532764c\") " pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.088231 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9v28\" (UniqueName: \"kubernetes.io/projected/9523917e-a416-4304-9479-9ef0a1a9a09d-kube-api-access-v9v28\") pod \"manila-share-share1-0\" (UID: \"9523917e-a416-4304-9479-9ef0a1a9a09d\") " pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 
15:09:29.127785 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-lib-modules\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.127855 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.127883 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-dev\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.127916 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.127956 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-config-data-custom\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128007 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-scripts\") pod 
\"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128034 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128114 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128147 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128165 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-run\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128187 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-nvme\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128209 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128227 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-sys\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128247 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-config-data\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128295 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/449d619a-d94f-45bc-9066-919a85d84f76-ceph\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128324 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlzn2\" (UniqueName: \"kubernetes.io/projected/449d619a-d94f-45bc-9066-919a85d84f76-kube-api-access-zlzn2\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128784 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-lib-modules\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " 
pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128860 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128895 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-dev\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.128924 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.129436 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-run\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.130781 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.131632 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-sys\") pod 
\"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.131706 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-nvme\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.131728 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.131763 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/449d619a-d94f-45bc-9066-919a85d84f76-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.132436 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-scripts\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.134965 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/449d619a-d94f-45bc-9066-919a85d84f76-ceph\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.134983 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-config-data-custom\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.135348 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.136377 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449d619a-d94f-45bc-9066-919a85d84f76-config-data\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.158909 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlzn2\" (UniqueName: \"kubernetes.io/projected/449d619a-d94f-45bc-9066-919a85d84f76-kube-api-access-zlzn2\") pod \"cinder-backup-0\" (UID: \"449d619a-d94f-45bc-9066-919a85d84f76\") " pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.159984 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.171237 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.204054 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.557551 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 15:09:29 crc kubenswrapper[4869]: W0312 15:09:29.567506 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95fcb56b_b075_469f_9a02_bcc737a18c26.slice/crio-81efa6df8a54c1ba761085934e9689266c2c8ee1a62b688b0c2b0bc6483b4895 WatchSource:0}: Error finding container 81efa6df8a54c1ba761085934e9689266c2c8ee1a62b688b0c2b0bc6483b4895: Status 404 returned error can't find the container with id 81efa6df8a54c1ba761085934e9689266c2c8ee1a62b688b0c2b0bc6483b4895 Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.798799 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 15:09:29 crc kubenswrapper[4869]: I0312 15:09:29.927179 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 15:09:29 crc kubenswrapper[4869]: W0312 15:09:29.941010 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod449d619a_d94f_45bc_9066_919a85d84f76.slice/crio-85a73f91cf202aaaf4d90fc70dfe453f13db1ba3a642a0838fc8ee5ef50adf6b WatchSource:0}: Error finding container 85a73f91cf202aaaf4d90fc70dfe453f13db1ba3a642a0838fc8ee5ef50adf6b: Status 404 returned error can't find the container with id 85a73f91cf202aaaf4d90fc70dfe453f13db1ba3a642a0838fc8ee5ef50adf6b Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.356448 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9514e1-cc53-4445-904c-4505fc60a1ea" path="/var/lib/kubelet/pods/5e9514e1-cc53-4445-904c-4505fc60a1ea/volumes" Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.358084 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6838a2d7-2052-45b9-a8d5-3aa6639bccb4" path="/var/lib/kubelet/pods/6838a2d7-2052-45b9-a8d5-3aa6639bccb4/volumes" Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.359218 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13a91ca-db8c-42b4-bfca-029e427aff28" path="/var/lib/kubelet/pods/c13a91ca-db8c-42b4-bfca-029e427aff28/volumes" Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.363152 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2897896-f1c6-4ec2-9d89-fc19b2bf18a9" path="/var/lib/kubelet/pods/f2897896-f1c6-4ec2-9d89-fc19b2bf18a9/volumes" Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.565790 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95fcb56b-b075-469f-9a02-bcc737a18c26","Type":"ContainerStarted","Data":"26f546c3e004a433d0ba1ec7f63190906cf71401ef4345b221738d7bfc6195e9"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.565840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95fcb56b-b075-469f-9a02-bcc737a18c26","Type":"ContainerStarted","Data":"81efa6df8a54c1ba761085934e9689266c2c8ee1a62b688b0c2b0bc6483b4895"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.575012 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"449d619a-d94f-45bc-9066-919a85d84f76","Type":"ContainerStarted","Data":"5c43f6d5ce79167d0dbff0add7e1da7bbe7877b8921a07a1e8429d3a339b823b"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.575050 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"449d619a-d94f-45bc-9066-919a85d84f76","Type":"ContainerStarted","Data":"5ec8fd073d5ace4d2815543adc2e3bab5d36e27228b05a1a36f15287d65ab513"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.575060 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"449d619a-d94f-45bc-9066-919a85d84f76","Type":"ContainerStarted","Data":"85a73f91cf202aaaf4d90fc70dfe453f13db1ba3a642a0838fc8ee5ef50adf6b"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.585389 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"870974b7-5400-44ae-91be-ca7bc532764c","Type":"ContainerStarted","Data":"d850e83a1a0f2958a690b28e05933d7f0090cfd2be3d32520c9352fc84fe28bf"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.585456 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"870974b7-5400-44ae-91be-ca7bc532764c","Type":"ContainerStarted","Data":"bf545aa02efe20091e18e05f26914ac4dffd1a8bf96342546007029dd259ded6"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.585470 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"870974b7-5400-44ae-91be-ca7bc532764c","Type":"ContainerStarted","Data":"8fd5197b75ba78ddfedf7cd32463dca2dc8faa39c54a54553d94423ee65c1e12"} Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.598640 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.598617909 podStartE2EDuration="2.598617909s" podCreationTimestamp="2026-03-12 15:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:30.595336986 +0000 UTC m=+1322.880562264" watchObservedRunningTime="2026-03-12 15:09:30.598617909 +0000 UTC m=+1322.883843197" Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.629479 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.629457731 podStartE2EDuration="2.629457731s" podCreationTimestamp="2026-03-12 15:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:30.621445382 +0000 UTC m=+1322.906670660" watchObservedRunningTime="2026-03-12 15:09:30.629457731 +0000 UTC m=+1322.914683019" Mar 12 15:09:30 crc kubenswrapper[4869]: I0312 15:09:30.858056 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 12 15:09:30 crc kubenswrapper[4869]: W0312 15:09:30.864715 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9523917e_a416_4304_9479_9ef0a1a9a09d.slice/crio-afdead76b74a9641e7ecd8c165d110306a2c8517154bb22d44dc82b1733bb320 WatchSource:0}: Error finding container afdead76b74a9641e7ecd8c165d110306a2c8517154bb22d44dc82b1733bb320: Status 404 returned error can't find the container with id afdead76b74a9641e7ecd8c165d110306a2c8517154bb22d44dc82b1733bb320 Mar 12 15:09:31 crc kubenswrapper[4869]: I0312 15:09:31.605038 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95fcb56b-b075-469f-9a02-bcc737a18c26","Type":"ContainerStarted","Data":"4d4ab4e5903eb8f10673a92a43cf10000d079a2db74a2fa56ad8d54a1d871286"} Mar 12 15:09:31 crc kubenswrapper[4869]: I0312 15:09:31.609903 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9523917e-a416-4304-9479-9ef0a1a9a09d","Type":"ContainerStarted","Data":"fb56d5f1492fe16fdc14a8ef14dd891cafdcd0a46a07ecd28bd17a7a9807c29e"} Mar 12 15:09:31 crc kubenswrapper[4869]: I0312 15:09:31.609948 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9523917e-a416-4304-9479-9ef0a1a9a09d","Type":"ContainerStarted","Data":"afdead76b74a9641e7ecd8c165d110306a2c8517154bb22d44dc82b1733bb320"} Mar 12 15:09:31 crc kubenswrapper[4869]: I0312 15:09:31.630996 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=3.630977708 podStartE2EDuration="3.630977708s" podCreationTimestamp="2026-03-12 15:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:31.623616568 +0000 UTC m=+1323.908841846" watchObservedRunningTime="2026-03-12 15:09:31.630977708 +0000 UTC m=+1323.916202986" Mar 12 15:09:32 crc kubenswrapper[4869]: I0312 15:09:32.632652 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"9523917e-a416-4304-9479-9ef0a1a9a09d","Type":"ContainerStarted","Data":"04f5c4096dcc87775c4643c200727f0123c573b6db9209d9ac3a7437de7da497"} Mar 12 15:09:32 crc kubenswrapper[4869]: I0312 15:09:32.667606 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.667586339 podStartE2EDuration="4.667586339s" podCreationTimestamp="2026-03-12 15:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:32.6634184 +0000 UTC m=+1324.948643698" watchObservedRunningTime="2026-03-12 15:09:32.667586339 +0000 UTC m=+1324.952811617" Mar 12 15:09:33 crc kubenswrapper[4869]: I0312 15:09:33.018845 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:09:33 crc kubenswrapper[4869]: I0312 15:09:33.019688 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-log" containerID="cri-o://c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7" gracePeriod=30 Mar 12 15:09:33 crc kubenswrapper[4869]: I0312 15:09:33.019810 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-httpd" containerID="cri-o://26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e" gracePeriod=30 Mar 12 15:09:33 crc kubenswrapper[4869]: I0312 15:09:33.643665 4869 generic.go:334] "Generic (PLEG): container finished" podID="c423b809-18ee-493c-91b6-d30846b0d68b" containerID="c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7" exitCode=143 Mar 12 15:09:33 crc kubenswrapper[4869]: I0312 15:09:33.643747 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c423b809-18ee-493c-91b6-d30846b0d68b","Type":"ContainerDied","Data":"c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7"} Mar 12 15:09:33 crc kubenswrapper[4869]: I0312 15:09:33.931441 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 15:09:34 crc kubenswrapper[4869]: I0312 15:09:34.160803 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:34 crc kubenswrapper[4869]: I0312 15:09:34.204828 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 12 15:09:34 crc kubenswrapper[4869]: I0312 15:09:34.450892 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 12 15:09:34 crc kubenswrapper[4869]: I0312 15:09:34.588597 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 15:09:34 crc kubenswrapper[4869]: I0312 15:09:34.652375 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-log" 
containerID="cri-o://6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd" gracePeriod=30 Mar 12 15:09:34 crc kubenswrapper[4869]: I0312 15:09:34.652444 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-httpd" containerID="cri-o://e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909" gracePeriod=30 Mar 12 15:09:35 crc kubenswrapper[4869]: I0312 15:09:35.662228 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerID="6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd" exitCode=143 Mar 12 15:09:35 crc kubenswrapper[4869]: I0312 15:09:35.662574 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6","Type":"ContainerDied","Data":"6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd"} Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.660136 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.675105 4869 generic.go:334] "Generic (PLEG): container finished" podID="c423b809-18ee-493c-91b6-d30846b0d68b" containerID="26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e" exitCode=0 Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.675201 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.675226 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c423b809-18ee-493c-91b6-d30846b0d68b","Type":"ContainerDied","Data":"26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e"} Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.676846 4869 scope.go:117] "RemoveContainer" containerID="26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.677292 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c423b809-18ee-493c-91b6-d30846b0d68b","Type":"ContainerDied","Data":"01c200ab9f8e075cedd24d767c813308013cc63c491fd7aaaf63118ff09ea6e5"} Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.711694 4869 scope.go:117] "RemoveContainer" containerID="c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.740848 4869 scope.go:117] "RemoveContainer" containerID="26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e" Mar 12 15:09:36 crc kubenswrapper[4869]: E0312 15:09:36.741357 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e\": container with ID starting with 26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e not found: ID does not exist" containerID="26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.741398 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e"} err="failed to get container status 
\"26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e\": rpc error: code = NotFound desc = could not find container \"26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e\": container with ID starting with 26f026590c1088bfa4ee32afd489c3b3d7eeb3392b7e739c59c0af659d2a2d7e not found: ID does not exist" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.741422 4869 scope.go:117] "RemoveContainer" containerID="c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7" Mar 12 15:09:36 crc kubenswrapper[4869]: E0312 15:09:36.741977 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7\": container with ID starting with c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7 not found: ID does not exist" containerID="c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.742027 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7"} err="failed to get container status \"c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7\": rpc error: code = NotFound desc = could not find container \"c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7\": container with ID starting with c64bce07796839a0d8f8c8631eb323e6f9d63a40fb647956a4d8232516d66ec7 not found: ID does not exist" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795605 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-httpd-run\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795704 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-scripts\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795732 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795794 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnfpc\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-kube-api-access-rnfpc\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795831 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-ceph\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795870 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-combined-ca-bundle\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795903 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-public-tls-certs\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") 
" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795937 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-logs\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.795955 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-config-data\") pod \"c423b809-18ee-493c-91b6-d30846b0d68b\" (UID: \"c423b809-18ee-493c-91b6-d30846b0d68b\") " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.796008 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.796433 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.796702 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-logs" (OuterVolumeSpecName: "logs") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.800550 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-scripts" (OuterVolumeSpecName: "scripts") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.802609 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-kube-api-access-rnfpc" (OuterVolumeSpecName: "kube-api-access-rnfpc") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "kube-api-access-rnfpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.803162 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.804739 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-ceph" (OuterVolumeSpecName: "ceph") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.844234 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.882711 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.887205 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-config-data" (OuterVolumeSpecName: "config-data") pod "c423b809-18ee-493c-91b6-d30846b0d68b" (UID: "c423b809-18ee-493c-91b6-d30846b0d68b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898378 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898432 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898444 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnfpc\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-kube-api-access-rnfpc\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898455 4869 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c423b809-18ee-493c-91b6-d30846b0d68b-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898464 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898472 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898482 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c423b809-18ee-493c-91b6-d30846b0d68b-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.898490 4869 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c423b809-18ee-493c-91b6-d30846b0d68b-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:36 crc kubenswrapper[4869]: I0312 15:09:36.919908 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.000133 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.018434 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.034132 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.078663 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:09:37 crc kubenswrapper[4869]: E0312 15:09:37.079441 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-httpd" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.079466 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-httpd" Mar 12 15:09:37 crc kubenswrapper[4869]: E0312 15:09:37.079503 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-log" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.079510 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-log" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.079747 4869 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-log" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.079776 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" containerName="glance-httpd" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.080940 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.087191 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.087627 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.087632 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.203475 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/442f7481-9051-4bc5-b3e5-13eef1e6ff01-logs\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.203774 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-scripts\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.203874 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/442f7481-9051-4bc5-b3e5-13eef1e6ff01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.203968 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.204069 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7nj\" (UniqueName: \"kubernetes.io/projected/442f7481-9051-4bc5-b3e5-13eef1e6ff01-kube-api-access-mj7nj\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.204231 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.204306 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-config-data\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.204418 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.204513 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/442f7481-9051-4bc5-b3e5-13eef1e6ff01-ceph\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.306849 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.306897 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-config-data\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.306932 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.306959 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/442f7481-9051-4bc5-b3e5-13eef1e6ff01-ceph\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.307013 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/442f7481-9051-4bc5-b3e5-13eef1e6ff01-logs\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.307053 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-scripts\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.307077 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/442f7481-9051-4bc5-b3e5-13eef1e6ff01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.307112 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.307140 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7nj\" (UniqueName: \"kubernetes.io/projected/442f7481-9051-4bc5-b3e5-13eef1e6ff01-kube-api-access-mj7nj\") pod 
\"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.307151 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.307698 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/442f7481-9051-4bc5-b3e5-13eef1e6ff01-logs\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.308018 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/442f7481-9051-4bc5-b3e5-13eef1e6ff01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.312249 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-scripts\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.312323 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " 
pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.317273 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.317720 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/442f7481-9051-4bc5-b3e5-13eef1e6ff01-config-data\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.338272 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7nj\" (UniqueName: \"kubernetes.io/projected/442f7481-9051-4bc5-b3e5-13eef1e6ff01-kube-api-access-mj7nj\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.341273 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/442f7481-9051-4bc5-b3e5-13eef1e6ff01-ceph\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.350021 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"442f7481-9051-4bc5-b3e5-13eef1e6ff01\") " pod="openstack/glance-default-external-api-0" Mar 12 15:09:37 crc kubenswrapper[4869]: I0312 15:09:37.411053 4869 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.053499 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.174336 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328195 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328252 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hdtp\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-kube-api-access-5hdtp\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328345 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-combined-ca-bundle\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328401 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-httpd-run\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328481 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-scripts\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328507 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-ceph\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328679 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-internal-tls-certs\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328719 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-config-data\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.328737 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-logs\") pod \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\" (UID: \"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6\") " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.329404 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.329905 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-logs" (OuterVolumeSpecName: "logs") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.337323 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-scripts" (OuterVolumeSpecName: "scripts") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.337456 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-ceph" (OuterVolumeSpecName: "ceph") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.338433 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.361563 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-kube-api-access-5hdtp" (OuterVolumeSpecName: "kube-api-access-5hdtp") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "kube-api-access-5hdtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.385523 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c423b809-18ee-493c-91b6-d30846b0d68b" path="/var/lib/kubelet/pods/c423b809-18ee-493c-91b6-d30846b0d68b/volumes" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.391748 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.435642 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.435674 4869 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.435684 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.435709 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.435721 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hdtp\" (UniqueName: \"kubernetes.io/projected/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-kube-api-access-5hdtp\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.435734 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.435746 4869 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.457812 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-config-data" (OuterVolumeSpecName: "config-data") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.464759 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" (UID: "e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.482298 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.538506 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.538565 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.538579 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.696974 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerID="e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909" exitCode=0
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.697018 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.697037 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6","Type":"ContainerDied","Data":"e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909"}
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.697661 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6","Type":"ContainerDied","Data":"49683d4279b6798f13e61af4fe8150324e31674036fc6f7529b14a0bc0b88b1f"}
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.697697 4869 scope.go:117] "RemoveContainer" containerID="e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.699159 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"442f7481-9051-4bc5-b3e5-13eef1e6ff01","Type":"ContainerStarted","Data":"ded9112cc0e951519bf3f2ef6d0ab5acb67e41c711fdb9250e4d45573241617b"}
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.719663 4869 scope.go:117] "RemoveContainer" containerID="6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.735441 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.750438 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.761460 4869 scope.go:117] "RemoveContainer" containerID="e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909"
Mar 12 15:09:38 crc kubenswrapper[4869]: E0312 15:09:38.761810 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909\": container with ID starting with e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909 not found: ID does not exist" containerID="e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.761849 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909"} err="failed to get container status \"e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909\": rpc error: code = NotFound desc = could not find container \"e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909\": container with ID starting with e29dc032d9f4f31a2776c6c744d3b4f9d0b1aaa6bed5a578ba80645b08858909 not found: ID does not exist"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.761870 4869 scope.go:117] "RemoveContainer" containerID="6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd"
Mar 12 15:09:38 crc kubenswrapper[4869]: E0312 15:09:38.762059 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd\": container with ID starting with 6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd not found: ID does not exist" containerID="6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.762091 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd"} err="failed to get container status \"6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd\": rpc error: code = NotFound desc = could not find container \"6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd\": container with ID starting with 6af107b35856299eb7786081ab0f1d7915dbb1dec4b6a48b4a08830298eb68bd not found: ID does not exist"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.772103 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 15:09:38 crc kubenswrapper[4869]: E0312 15:09:38.772767 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-log"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.772791 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-log"
Mar 12 15:09:38 crc kubenswrapper[4869]: E0312 15:09:38.772815 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-httpd"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.772824 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-httpd"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.773053 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-log"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.773071 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" containerName="glance-httpd"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.774374 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.777022 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.777201 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.802894 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.843634 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.843864 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.844018 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/471f34e2-b42f-468a-bd32-a1fa9612dbdc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.844114 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.844218 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.844310 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.844403 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84gzc\" (UniqueName: \"kubernetes.io/projected/471f34e2-b42f-468a-bd32-a1fa9612dbdc-kube-api-access-84gzc\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.844568 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/471f34e2-b42f-468a-bd32-a1fa9612dbdc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.844704 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471f34e2-b42f-468a-bd32-a1fa9612dbdc-logs\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946369 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946416 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946458 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/471f34e2-b42f-468a-bd32-a1fa9612dbdc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946481 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946528 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946585 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946626 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84gzc\" (UniqueName: \"kubernetes.io/projected/471f34e2-b42f-468a-bd32-a1fa9612dbdc-kube-api-access-84gzc\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946673 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/471f34e2-b42f-468a-bd32-a1fa9612dbdc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.946714 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471f34e2-b42f-468a-bd32-a1fa9612dbdc-logs\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.947201 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471f34e2-b42f-468a-bd32-a1fa9612dbdc-logs\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.947566 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.949087 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/471f34e2-b42f-468a-bd32-a1fa9612dbdc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.951128 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.951499 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.953373 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.957773 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471f34e2-b42f-468a-bd32-a1fa9612dbdc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.972215 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/471f34e2-b42f-468a-bd32-a1fa9612dbdc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.973238 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84gzc\" (UniqueName: \"kubernetes.io/projected/471f34e2-b42f-468a-bd32-a1fa9612dbdc-kube-api-access-84gzc\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:38 crc kubenswrapper[4869]: I0312 15:09:38.992461 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"471f34e2-b42f-468a-bd32-a1fa9612dbdc\") " pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:39 crc kubenswrapper[4869]: E0312 15:09:39.019054 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice/crio-23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7\": RecentStats: unable to find data in memory cache]"
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.097842 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.152488 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.172984 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.545839 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.711009 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"442f7481-9051-4bc5-b3e5-13eef1e6ff01","Type":"ContainerStarted","Data":"3f77594f199b31779ce0103ce6a580f2ff30565133c581d375d8d699fd831992"}
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.711362 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"442f7481-9051-4bc5-b3e5-13eef1e6ff01","Type":"ContainerStarted","Data":"5919067d1bd49afbbe8831d6d93ddfcb6c1bbc9bc3ca5079b78cce2b76f624db"}
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.712678 4869 generic.go:334] "Generic (PLEG): container finished" podID="7c21e651-1883-4a95-a803-e6a5a2b7f457" containerID="fb8feff685efe1b60d51bd5cf1b516b74dd7980d841d58162ce0d145a168e576" exitCode=0
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.712737 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tgt7x" event={"ID":"7c21e651-1883-4a95-a803-e6a5a2b7f457","Type":"ContainerDied","Data":"fb8feff685efe1b60d51bd5cf1b516b74dd7980d841d58162ce0d145a168e576"}
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.736660 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.736634011 podStartE2EDuration="2.736634011s" podCreationTimestamp="2026-03-12 15:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:39.7355555 +0000 UTC m=+1332.020780808" watchObservedRunningTime="2026-03-12 15:09:39.736634011 +0000 UTC m=+1332.021859289"
Mar 12 15:09:39 crc kubenswrapper[4869]: I0312 15:09:39.833133 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 15:09:40 crc kubenswrapper[4869]: I0312 15:09:40.347300 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6" path="/var/lib/kubelet/pods/e2cc7b5a-a92b-4ad2-8778-d6c25782eeb6/volumes"
Mar 12 15:09:40 crc kubenswrapper[4869]: I0312 15:09:40.731996 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"471f34e2-b42f-468a-bd32-a1fa9612dbdc","Type":"ContainerStarted","Data":"f7884b31ede27b37386b9467a3de2d4ad766b8d132d45233cbfeea4a8434527d"}
Mar 12 15:09:40 crc kubenswrapper[4869]: I0312 15:09:40.732051 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"471f34e2-b42f-468a-bd32-a1fa9612dbdc","Type":"ContainerStarted","Data":"2755b5edf8bd4708ed353ae03f5b91be54e470aae25af05be1fd4c6e4f963992"}
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.091519 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tgt7x"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.207482 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-scripts\") pod \"7c21e651-1883-4a95-a803-e6a5a2b7f457\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") "
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.207569 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-combined-ca-bundle\") pod \"7c21e651-1883-4a95-a803-e6a5a2b7f457\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") "
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.207716 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-config-data\") pod \"7c21e651-1883-4a95-a803-e6a5a2b7f457\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") "
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.207821 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8ssr\" (UniqueName: \"kubernetes.io/projected/7c21e651-1883-4a95-a803-e6a5a2b7f457-kube-api-access-r8ssr\") pod \"7c21e651-1883-4a95-a803-e6a5a2b7f457\" (UID: \"7c21e651-1883-4a95-a803-e6a5a2b7f457\") "
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.214308 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-scripts" (OuterVolumeSpecName: "scripts") pod "7c21e651-1883-4a95-a803-e6a5a2b7f457" (UID: "7c21e651-1883-4a95-a803-e6a5a2b7f457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.215740 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c21e651-1883-4a95-a803-e6a5a2b7f457-kube-api-access-r8ssr" (OuterVolumeSpecName: "kube-api-access-r8ssr") pod "7c21e651-1883-4a95-a803-e6a5a2b7f457" (UID: "7c21e651-1883-4a95-a803-e6a5a2b7f457"). InnerVolumeSpecName "kube-api-access-r8ssr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.257480 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c21e651-1883-4a95-a803-e6a5a2b7f457" (UID: "7c21e651-1883-4a95-a803-e6a5a2b7f457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.259809 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-config-data" (OuterVolumeSpecName: "config-data") pod "7c21e651-1883-4a95-a803-e6a5a2b7f457" (UID: "7c21e651-1883-4a95-a803-e6a5a2b7f457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.310596 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.310631 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8ssr\" (UniqueName: \"kubernetes.io/projected/7c21e651-1883-4a95-a803-e6a5a2b7f457-kube-api-access-r8ssr\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.310642 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.310650 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c21e651-1883-4a95-a803-e6a5a2b7f457-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.741494 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tgt7x" event={"ID":"7c21e651-1883-4a95-a803-e6a5a2b7f457","Type":"ContainerDied","Data":"b1b97fd2a029702b53aea5f04e9529e1b914d9d6edd137d07b0b9c5d44f4b9de"}
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.741807 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1b97fd2a029702b53aea5f04e9529e1b914d9d6edd137d07b0b9c5d44f4b9de"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.741656 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tgt7x"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.753457 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"471f34e2-b42f-468a-bd32-a1fa9612dbdc","Type":"ContainerStarted","Data":"806d8f88b86f21684e6ac9a21d2f5f2c731a83572c9014ac0c08e5deea379322"}
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.779578 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.779557017 podStartE2EDuration="3.779557017s" podCreationTimestamp="2026-03-12 15:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:41.776310924 +0000 UTC m=+1334.061536212" watchObservedRunningTime="2026-03-12 15:09:41.779557017 +0000 UTC m=+1334.064782305"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.882125 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 15:09:41 crc kubenswrapper[4869]: E0312 15:09:41.882496 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c21e651-1883-4a95-a803-e6a5a2b7f457" containerName="nova-cell0-conductor-db-sync"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.882511 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c21e651-1883-4a95-a803-e6a5a2b7f457" containerName="nova-cell0-conductor-db-sync"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.882683 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c21e651-1883-4a95-a803-e6a5a2b7f457" containerName="nova-cell0-conductor-db-sync"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.883246 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.887888 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tljpw"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.888064 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.899301 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.923186 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.923379 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jssm5\" (UniqueName: \"kubernetes.io/projected/bb211657-8745-48c2-ac07-ab3ae88fe808-kube-api-access-jssm5\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:41 crc kubenswrapper[4869]: I0312 15:09:41.923417 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.025443 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.025606 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jssm5\" (UniqueName: \"kubernetes.io/projected/bb211657-8745-48c2-ac07-ab3ae88fe808-kube-api-access-jssm5\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.025633 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.035367 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.045672 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.046169 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jssm5\" (UniqueName: \"kubernetes.io/projected/bb211657-8745-48c2-ac07-ab3ae88fe808-kube-api-access-jssm5\") pod \"nova-cell0-conductor-0\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.199958 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:42 crc kubenswrapper[4869]: I0312 15:09:42.749220 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 15:09:42 crc kubenswrapper[4869]: W0312 15:09:42.757363 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb211657_8745_48c2_ac07_ab3ae88fe808.slice/crio-e5c20270a037603f797cbff34ce83b7ca64bad4dcd108d9185dc71b142a6240f WatchSource:0}: Error finding container e5c20270a037603f797cbff34ce83b7ca64bad4dcd108d9185dc71b142a6240f: Status 404 returned error can't find the container with id e5c20270a037603f797cbff34ce83b7ca64bad4dcd108d9185dc71b142a6240f
Mar 12 15:09:43 crc kubenswrapper[4869]: I0312 15:09:43.774650 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb211657-8745-48c2-ac07-ab3ae88fe808","Type":"ContainerStarted","Data":"8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6"}
Mar 12 15:09:43 crc kubenswrapper[4869]: I0312 15:09:43.776225 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb211657-8745-48c2-ac07-ab3ae88fe808","Type":"ContainerStarted","Data":"e5c20270a037603f797cbff34ce83b7ca64bad4dcd108d9185dc71b142a6240f"}
Mar 12 15:09:43 crc kubenswrapper[4869]: I0312 15:09:43.776318 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:43 crc kubenswrapper[4869]: I0312 15:09:43.796017 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.796000175 podStartE2EDuration="2.796000175s" podCreationTimestamp="2026-03-12 15:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:43.791074634 +0000 UTC m=+1336.076299912" watchObservedRunningTime="2026-03-12 15:09:43.796000175 +0000 UTC m=+1336.081225453"
Mar 12 15:09:44 crc kubenswrapper[4869]: I0312 15:09:44.558485 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 15:09:45 crc kubenswrapper[4869]: I0312 15:09:45.795772 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="bb211657-8745-48c2-ac07-ab3ae88fe808" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6" gracePeriod=30
Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.573753 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.719021 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-config-data\") pod \"bb211657-8745-48c2-ac07-ab3ae88fe808\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") "
Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.719211 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jssm5\" (UniqueName: \"kubernetes.io/projected/bb211657-8745-48c2-ac07-ab3ae88fe808-kube-api-access-jssm5\") pod \"bb211657-8745-48c2-ac07-ab3ae88fe808\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") "
Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.719352 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-combined-ca-bundle\") pod \"bb211657-8745-48c2-ac07-ab3ae88fe808\" (UID: \"bb211657-8745-48c2-ac07-ab3ae88fe808\") "
Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.725871 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb211657-8745-48c2-ac07-ab3ae88fe808-kube-api-access-jssm5" (OuterVolumeSpecName: "kube-api-access-jssm5") pod "bb211657-8745-48c2-ac07-ab3ae88fe808" (UID: "bb211657-8745-48c2-ac07-ab3ae88fe808"). InnerVolumeSpecName "kube-api-access-jssm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.750494 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb211657-8745-48c2-ac07-ab3ae88fe808" (UID: "bb211657-8745-48c2-ac07-ab3ae88fe808"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.752846 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-config-data" (OuterVolumeSpecName: "config-data") pod "bb211657-8745-48c2-ac07-ab3ae88fe808" (UID: "bb211657-8745-48c2-ac07-ab3ae88fe808"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.823694 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.823753 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jssm5\" (UniqueName: \"kubernetes.io/projected/bb211657-8745-48c2-ac07-ab3ae88fe808-kube-api-access-jssm5\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.823769 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb211657-8745-48c2-ac07-ab3ae88fe808-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.828019 4869 generic.go:334] "Generic (PLEG): container finished" podID="bb211657-8745-48c2-ac07-ab3ae88fe808" containerID="8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6" exitCode=0 Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.828062 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb211657-8745-48c2-ac07-ab3ae88fe808","Type":"ContainerDied","Data":"8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6"} Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.828092 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb211657-8745-48c2-ac07-ab3ae88fe808","Type":"ContainerDied","Data":"e5c20270a037603f797cbff34ce83b7ca64bad4dcd108d9185dc71b142a6240f"} Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.828112 4869 scope.go:117] "RemoveContainer" containerID="8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.828124 4869 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.851508 4869 scope.go:117] "RemoveContainer" containerID="8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6" Mar 12 15:09:46 crc kubenswrapper[4869]: E0312 15:09:46.851906 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6\": container with ID starting with 8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6 not found: ID does not exist" containerID="8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.852004 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6"} err="failed to get container status \"8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6\": rpc error: code = NotFound desc = could not find container \"8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6\": container with ID starting with 8ff43c2fba19d7e4cca02e00a60b51633fdd905c8cf416806ebc547fed83e0e6 not found: ID does not exist" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.877915 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.886445 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.896485 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 15:09:46 crc kubenswrapper[4869]: E0312 15:09:46.897023 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb211657-8745-48c2-ac07-ab3ae88fe808" 
containerName="nova-cell0-conductor-conductor" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.897044 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb211657-8745-48c2-ac07-ab3ae88fe808" containerName="nova-cell0-conductor-conductor" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.897253 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb211657-8745-48c2-ac07-ab3ae88fe808" containerName="nova-cell0-conductor-conductor" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.898131 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.910350 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.915024 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tljpw" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.915341 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.925848 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 15:09:46.926037 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:46 crc kubenswrapper[4869]: I0312 
15:09:46.926142 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sp94\" (UniqueName: \"kubernetes.io/projected/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-kube-api-access-4sp94\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.027039 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.027135 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.027160 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sp94\" (UniqueName: \"kubernetes.io/projected/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-kube-api-access-4sp94\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.033469 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.037226 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.063173 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sp94\" (UniqueName: \"kubernetes.io/projected/ba5ec75b-57f3-43a2-b6eb-4f876d368fae-kube-api-access-4sp94\") pod \"nova-cell0-conductor-0\" (UID: \"ba5ec75b-57f3-43a2-b6eb-4f876d368fae\") " pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.244063 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.412296 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.413483 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.451261 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.451713 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.656133 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.839130 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"ba5ec75b-57f3-43a2-b6eb-4f876d368fae","Type":"ContainerStarted","Data":"14e52ad39c3a172359f545600ae3bed3b8fb39f5600cd5c3fce8d5f481ff0d7e"} Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.839778 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 15:09:47 crc kubenswrapper[4869]: I0312 15:09:47.840078 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 15:09:48 crc kubenswrapper[4869]: I0312 15:09:48.353370 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb211657-8745-48c2-ac07-ab3ae88fe808" path="/var/lib/kubelet/pods/bb211657-8745-48c2-ac07-ab3ae88fe808/volumes" Mar 12 15:09:48 crc kubenswrapper[4869]: I0312 15:09:48.854933 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ba5ec75b-57f3-43a2-b6eb-4f876d368fae","Type":"ContainerStarted","Data":"31509f1d1d7b1916de4521929b9979d0c7b6dba0e857f3cdbec01e3e574402f7"} Mar 12 15:09:48 crc kubenswrapper[4869]: I0312 15:09:48.882077 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.882056265 podStartE2EDuration="2.882056265s" podCreationTimestamp="2026-03-12 15:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:48.871767231 +0000 UTC m=+1341.156992529" watchObservedRunningTime="2026-03-12 15:09:48.882056265 +0000 UTC m=+1341.167281543" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.099057 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.100733 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.129964 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.146191 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:49 crc kubenswrapper[4869]: E0312 15:09:49.307134 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice/crio-23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice\": RecentStats: unable to find data in memory cache]" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.854182 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.863689 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.864338 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.864381 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.864391 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:49 crc kubenswrapper[4869]: I0312 15:09:49.935027 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Mar 12 15:09:50 crc kubenswrapper[4869]: I0312 15:09:50.687636 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.016100 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.016522 4869 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.020573 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.269017 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.729145 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bzm2j"] Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.730626 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.733550 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.733716 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.738306 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bzm2j"] Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.863204 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-scripts\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.863340 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-config-data\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.863361 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jqp\" (UniqueName: \"kubernetes.io/projected/2829ee85-1daa-4418-b99c-90bbebceb7c2-kube-api-access-n5jqp\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.863410 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.885041 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.886894 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.889988 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.913043 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965719 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-config-data\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965757 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jqp\" (UniqueName: \"kubernetes.io/projected/2829ee85-1daa-4418-b99c-90bbebceb7c2-kube-api-access-n5jqp\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965785 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txcmm\" (UniqueName: \"kubernetes.io/projected/7df96c47-1886-48fe-8ee5-8436498d91ce-kube-api-access-txcmm\") pod \"nova-metadata-0\" (UID: 
\"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965805 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965857 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965924 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-scripts\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965951 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-config-data\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.965980 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7df96c47-1886-48fe-8ee5-8436498d91ce-logs\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:52 crc 
kubenswrapper[4869]: I0312 15:09:52.974735 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-config-data\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.975688 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.995654 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-scripts\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:52 crc kubenswrapper[4869]: I0312 15:09:52.996828 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jqp\" (UniqueName: \"kubernetes.io/projected/2829ee85-1daa-4418-b99c-90bbebceb7c2-kube-api-access-n5jqp\") pod \"nova-cell0-cell-mapping-bzm2j\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.029750 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-jzxfs"] Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.031264 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.054020 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-jzxfs"] Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.063923 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.069928 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-config-data\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.070168 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7df96c47-1886-48fe-8ee5-8436498d91ce-logs\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071009 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7df96c47-1886-48fe-8ee5-8436498d91ce-logs\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071167 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-config\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071276 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071323 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txcmm\" (UniqueName: \"kubernetes.io/projected/7df96c47-1886-48fe-8ee5-8436498d91ce-kube-api-access-txcmm\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071344 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071375 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-svc\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071392 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.071474 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2h4\" (UniqueName: \"kubernetes.io/projected/079bc323-9373-49b7-ba69-d64155e9e902-kube-api-access-mb2h4\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.081023 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.093829 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-config-data\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.101219 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.103603 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.110145 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txcmm\" (UniqueName: \"kubernetes.io/projected/7df96c47-1886-48fe-8ee5-8436498d91ce-kube-api-access-txcmm\") pod \"nova-metadata-0\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " pod="openstack/nova-metadata-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.110474 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.178633 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-svc\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.178682 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.178775 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb2h4\" (UniqueName: \"kubernetes.io/projected/079bc323-9373-49b7-ba69-d64155e9e902-kube-api-access-mb2h4\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.178961 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.178991 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-config\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.179079 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.179966 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.180709 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-svc\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.181529 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-config\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.181714 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.184882 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.184943 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.210469 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2h4\" (UniqueName: \"kubernetes.io/projected/079bc323-9373-49b7-ba69-d64155e9e902-kube-api-access-mb2h4\") pod \"dnsmasq-dns-6b6c754dc9-jzxfs\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.214675 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.216970 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.222270 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.233768 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.263217 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.279824 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.280020 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-config-data\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.280054 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.280076 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.280131 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1248e0-398e-48f9-adad-d214c7700bb8-logs\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.280169 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6cn6\" (UniqueName: \"kubernetes.io/projected/5f1248e0-398e-48f9-adad-d214c7700bb8-kube-api-access-z6cn6\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.280211 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvs4\" (UniqueName: \"kubernetes.io/projected/0ba9b14c-abed-454e-a2f3-809ccafcd42f-kube-api-access-tjvs4\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.280235 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.281033 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.287853 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.294225 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.372128 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.389695 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvs4\" (UniqueName: \"kubernetes.io/projected/0ba9b14c-abed-454e-a2f3-809ccafcd42f-kube-api-access-tjvs4\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.389745 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.392034 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-config-data\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.392312 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.392379 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.392549 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1248e0-398e-48f9-adad-d214c7700bb8-logs\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.392682 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6cn6\" (UniqueName: \"kubernetes.io/projected/5f1248e0-398e-48f9-adad-d214c7700bb8-kube-api-access-z6cn6\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.393140 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1248e0-398e-48f9-adad-d214c7700bb8-logs\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.404666 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.410049 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6cn6\" (UniqueName: \"kubernetes.io/projected/5f1248e0-398e-48f9-adad-d214c7700bb8-kube-api-access-z6cn6\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.410749 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.414574 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvs4\" (UniqueName: \"kubernetes.io/projected/0ba9b14c-abed-454e-a2f3-809ccafcd42f-kube-api-access-tjvs4\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.416582 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.416980 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-config-data\") pod \"nova-api-0\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.494723 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.494798 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.494845 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmt8\" (UniqueName: \"kubernetes.io/projected/aec7dc5f-67e4-4920-8454-2b3e1f43d593-kube-api-access-2qmt8\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.501358 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.582968 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.597716 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.597777 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.597823 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmt8\" (UniqueName: \"kubernetes.io/projected/aec7dc5f-67e4-4920-8454-2b3e1f43d593-kube-api-access-2qmt8\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.605322 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.609531 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.631159 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmt8\" (UniqueName: \"kubernetes.io/projected/aec7dc5f-67e4-4920-8454-2b3e1f43d593-kube-api-access-2qmt8\") pod \"nova-cell1-novncproxy-0\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.818411 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bzm2j"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.882753 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.918882 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.938461 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdqpj"]
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.939661 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.943182 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.958628 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.962314 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7df96c47-1886-48fe-8ee5-8436498d91ce","Type":"ContainerStarted","Data":"fc455202815216cadb5bd016ab8aeb1e7104b27a1f10848fb9a85d2a8f8d5765"}
Mar 12 15:09:53 crc kubenswrapper[4869]: I0312 15:09:53.964496 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bzm2j" event={"ID":"2829ee85-1daa-4418-b99c-90bbebceb7c2","Type":"ContainerStarted","Data":"70345be3937fbef859414d2706e6c25f9c9d6d464a5fca813ba474a6b2ee2b76"}
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:53.996006 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdqpj"]
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.114981 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-config-data\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.115045 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-scripts\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.115123 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.115143 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrcbs\" (UniqueName: \"kubernetes.io/projected/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-kube-api-access-qrcbs\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.157608 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-jzxfs"]
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.218211 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.218674 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrcbs\" (UniqueName: \"kubernetes.io/projected/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-kube-api-access-qrcbs\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.218833 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-config-data\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.218894 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-scripts\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.222574 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-scripts\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.224762 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-config-data\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.228680 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.253560 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrcbs\" (UniqueName: \"kubernetes.io/projected/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-kube-api-access-qrcbs\") pod \"nova-cell1-conductor-db-sync-wdqpj\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.271814 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 15:09:54 crc kubenswrapper[4869]: W0312 15:09:54.296013 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ba9b14c_abed_454e_a2f3_809ccafcd42f.slice/crio-081e35002fd9b43ba28f46cb613d8f60efde756e0633a17dee0f3984f300c389 WatchSource:0}: Error finding container 081e35002fd9b43ba28f46cb613d8f60efde756e0633a17dee0f3984f300c389: Status 404 returned error can't find the container with id 081e35002fd9b43ba28f46cb613d8f60efde756e0633a17dee0f3984f300c389
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.310829 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.390931 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdqpj"
Mar 12 15:09:54 crc kubenswrapper[4869]: E0312 15:09:54.426448 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-api:current-podified: can't talk to a V1 container registry" image="quay.io/podified-antelope-centos9/openstack-nova-api:current-podified"
Mar 12 15:09:54 crc kubenswrapper[4869]: E0312 15:09:54.426630 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-api-log,Image:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,Command:[/usr/bin/dumb-init],Args:[--single-child -- /bin/sh -c /usr/bin/tail -n+1 -F /var/log/nova/nova-api.log 2>/dev/null],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n668h5dhfch588h8ch67bh667h66h5b4h64bh56hbchbfh8dh65h5b6hfch84h577h658h5cfh698h644h5f4h5b7h5d9hfdh645h655h57bh586hcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/nova,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z6cn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 8774 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 8774 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 8774 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-api-0_openstack(5f1248e0-398e-48f9-adad-d214c7700bb8): ErrImagePull: initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-api:current-podified: can't talk to a V1 container registry" logger="UnhandledError"
Mar 12 15:09:54 crc kubenswrapper[4869]: E0312 15:09:54.431062 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"nova-api-log\" with ErrImagePull: \"initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-api:current-podified: can't talk to a V1 container registry\", failed to \"StartContainer\" for \"nova-api-api\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-api:current-podified\\\"\"]" pod="openstack/nova-api-0" podUID="5f1248e0-398e-48f9-adad-d214c7700bb8"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.521860 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 15:09:54 crc kubenswrapper[4869]: E0312 15:09:54.678382 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: Requesting bearer token: invalid status code from registry 502 (Bad Gateway)" image="quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified"
Mar 12 15:09:54 crc kubenswrapper[4869]: E0312 15:09:54.678570 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell1-novncproxy-novncproxy,Image:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bfh9ch5ffh58h646h558h55h684h7h559h5f5h9dh568hd5h55ch58dh78h59bh548h5c5h699h55fh9ch7h5bch76h86h68fh58dhf4h559h689q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-novncproxy-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2qmt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell1-novncproxy-0_openstack(aec7dc5f-67e4-4920-8454-2b3e1f43d593): ErrImagePull: initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: Requesting bearer token: invalid status code from registry 502 (Bad Gateway)" logger="UnhandledError"
Mar 12 15:09:54 crc kubenswrapper[4869]: E0312 15:09:54.682654 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ErrImagePull: \"initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: Requesting bearer token: invalid status code from registry 502 (Bad Gateway)\"" pod="openstack/nova-cell1-novncproxy-0" podUID="aec7dc5f-67e4-4920-8454-2b3e1f43d593"
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.928420 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdqpj"]
Mar 12 15:09:54 crc kubenswrapper[4869]: W0312 15:09:54.931501 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf547181b_aa83_44ec_9c1a_dfbc394ff6a5.slice/crio-79b83a6cb1d80c12019079822ed494ff34fea75407c1f57968c864be2bc5b32c WatchSource:0}: Error finding container 79b83a6cb1d80c12019079822ed494ff34fea75407c1f57968c864be2bc5b32c: Status 404 returned error can't find the container with id 79b83a6cb1d80c12019079822ed494ff34fea75407c1f57968c864be2bc5b32c
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.986405 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bzm2j" event={"ID":"2829ee85-1daa-4418-b99c-90bbebceb7c2","Type":"ContainerStarted","Data":"59b4471156ad7b9f1040a028511e1b2918ce761e3e6f4b926fc37d1b83e33231"}
Mar 12 15:09:54 crc kubenswrapper[4869]: I0312 15:09:54.989048 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdqpj" event={"ID":"f547181b-aa83-44ec-9c1a-dfbc394ff6a5","Type":"ContainerStarted","Data":"79b83a6cb1d80c12019079822ed494ff34fea75407c1f57968c864be2bc5b32c"}
Mar 12 15:09:55 crc kubenswrapper[4869]: I0312 15:09:55.009139 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bzm2j" podStartSLOduration=3.009122992 podStartE2EDuration="3.009122992s" podCreationTimestamp="2026-03-12 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:55.001329489 +0000 UTC m=+1347.286554767" watchObservedRunningTime="2026-03-12 15:09:55.009122992 +0000 UTC m=+1347.294348270"
Mar 12 15:09:55 crc kubenswrapper[4869]: I0312 15:09:55.012469 4869 generic.go:334] "Generic (PLEG): container finished" podID="079bc323-9373-49b7-ba69-d64155e9e902" containerID="f31bb762f40d5512ae7613e29ecc5b03ad21a7c2e2bf74be3487911076996654" exitCode=0
Mar 12 15:09:55 crc kubenswrapper[4869]: I0312 15:09:55.012576 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" event={"ID":"079bc323-9373-49b7-ba69-d64155e9e902","Type":"ContainerDied","Data":"f31bb762f40d5512ae7613e29ecc5b03ad21a7c2e2bf74be3487911076996654"}
Mar 12 15:09:55 crc kubenswrapper[4869]: I0312 15:09:55.012614 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" event={"ID":"079bc323-9373-49b7-ba69-d64155e9e902","Type":"ContainerStarted","Data":"276ee0c126a776e50b6e4ac34ec6efc0ade7e8454dd26d1120c063e088c05948"}
Mar 12 15:09:55 crc kubenswrapper[4869]: I0312 15:09:55.023851 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f1248e0-398e-48f9-adad-d214c7700bb8","Type":"ContainerStarted","Data":"354a3583d6498691de0324a3991957292fc80e30d59e3a9a5ba5704911370370"}
Mar 12 15:09:55 crc kubenswrapper[4869]: E0312 15:09:55.026611 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"nova-api-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-api:current-podified\\\"\", failed to \"StartContainer\" for \"nova-api-api\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-api:current-podified\\\"\"]" pod="openstack/nova-api-0" podUID="5f1248e0-398e-48f9-adad-d214c7700bb8"
Mar 12 15:09:55 crc kubenswrapper[4869]:
I0312 15:09:55.031231 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba9b14c-abed-454e-a2f3-809ccafcd42f","Type":"ContainerStarted","Data":"081e35002fd9b43ba28f46cb613d8f60efde756e0633a17dee0f3984f300c389"} Mar 12 15:09:55 crc kubenswrapper[4869]: I0312 15:09:55.077758 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aec7dc5f-67e4-4920-8454-2b3e1f43d593","Type":"ContainerStarted","Data":"f1f686cd511fa125014dd1e3cb0b714a2ae514f41ada06cb9a1f91950db88adb"} Mar 12 15:09:55 crc kubenswrapper[4869]: E0312 15:09:55.080931 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified\\\"\"" pod="openstack/nova-cell1-novncproxy-0" podUID="aec7dc5f-67e4-4920-8454-2b3e1f43d593" Mar 12 15:09:56 crc kubenswrapper[4869]: I0312 15:09:56.090762 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdqpj" event={"ID":"f547181b-aa83-44ec-9c1a-dfbc394ff6a5","Type":"ContainerStarted","Data":"4fc0e8e6db5dd590434b9022f05c0318a9911d651083687ec672665e85926ce1"} Mar 12 15:09:56 crc kubenswrapper[4869]: I0312 15:09:56.093577 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" event={"ID":"079bc323-9373-49b7-ba69-d64155e9e902","Type":"ContainerStarted","Data":"e3a07bd69d8e49c784071c6adf360c3eb344bcfd4b9a799e7db5cdfbbfaafbec"} Mar 12 15:09:56 crc kubenswrapper[4869]: E0312 15:09:56.095943 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified\\\"\"" pod="openstack/nova-cell1-novncproxy-0" 
podUID="aec7dc5f-67e4-4920-8454-2b3e1f43d593" Mar 12 15:09:56 crc kubenswrapper[4869]: E0312 15:09:56.097075 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"nova-api-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-api:current-podified\\\"\", failed to \"StartContainer\" for \"nova-api-api\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-api:current-podified\\\"\"]" pod="openstack/nova-api-0" podUID="5f1248e0-398e-48f9-adad-d214c7700bb8" Mar 12 15:09:56 crc kubenswrapper[4869]: I0312 15:09:56.118251 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wdqpj" podStartSLOduration=3.118231247 podStartE2EDuration="3.118231247s" podCreationTimestamp="2026-03-12 15:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:56.10924195 +0000 UTC m=+1348.394467238" watchObservedRunningTime="2026-03-12 15:09:56.118231247 +0000 UTC m=+1348.403456525" Mar 12 15:09:56 crc kubenswrapper[4869]: I0312 15:09:56.169785 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" podStartSLOduration=4.169764932 podStartE2EDuration="4.169764932s" podCreationTimestamp="2026-03-12 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:09:56.167499067 +0000 UTC m=+1348.452724365" watchObservedRunningTime="2026-03-12 15:09:56.169764932 +0000 UTC m=+1348.454990210" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.133259 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7df96c47-1886-48fe-8ee5-8436498d91ce","Type":"ContainerStarted","Data":"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318"} Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.133670 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.314834 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.331869 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.705358 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.830020 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qmt8\" (UniqueName: \"kubernetes.io/projected/aec7dc5f-67e4-4920-8454-2b3e1f43d593-kube-api-access-2qmt8\") pod \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.830093 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-combined-ca-bundle\") pod \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.830290 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-config-data\") pod \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\" (UID: \"aec7dc5f-67e4-4920-8454-2b3e1f43d593\") " Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.836087 4869 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aec7dc5f-67e4-4920-8454-2b3e1f43d593" (UID: "aec7dc5f-67e4-4920-8454-2b3e1f43d593"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.850112 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-config-data" (OuterVolumeSpecName: "config-data") pod "aec7dc5f-67e4-4920-8454-2b3e1f43d593" (UID: "aec7dc5f-67e4-4920-8454-2b3e1f43d593"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.850300 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec7dc5f-67e4-4920-8454-2b3e1f43d593-kube-api-access-2qmt8" (OuterVolumeSpecName: "kube-api-access-2qmt8") pod "aec7dc5f-67e4-4920-8454-2b3e1f43d593" (UID: "aec7dc5f-67e4-4920-8454-2b3e1f43d593"). InnerVolumeSpecName "kube-api-access-2qmt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.932643 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qmt8\" (UniqueName: \"kubernetes.io/projected/aec7dc5f-67e4-4920-8454-2b3e1f43d593-kube-api-access-2qmt8\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.932675 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:57 crc kubenswrapper[4869]: I0312 15:09:57.932684 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec7dc5f-67e4-4920-8454-2b3e1f43d593-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.145254 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.145293 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aec7dc5f-67e4-4920-8454-2b3e1f43d593","Type":"ContainerDied","Data":"f1f686cd511fa125014dd1e3cb0b714a2ae514f41ada06cb9a1f91950db88adb"} Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.160602 4869 generic.go:334] "Generic (PLEG): container finished" podID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerID="df9ada5ff4cc69b25fabf28f675a3451db0d54dce2b7dce3f095cf1db1450059" exitCode=137 Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.160699 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerDied","Data":"df9ada5ff4cc69b25fabf28f675a3451db0d54dce2b7dce3f095cf1db1450059"} Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.162673 4869 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-log" containerID="cri-o://75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318" gracePeriod=30 Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.163238 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-metadata" containerID="cri-o://91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1" gracePeriod=30 Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.163321 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7df96c47-1886-48fe-8ee5-8436498d91ce","Type":"ContainerStarted","Data":"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1"} Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.192801 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.527245583 podStartE2EDuration="6.192783027s" podCreationTimestamp="2026-03-12 15:09:52 +0000 UTC" firstStartedPulling="2026-03-12 15:09:53.916365195 +0000 UTC m=+1346.201590473" lastFinishedPulling="2026-03-12 15:09:56.581902639 +0000 UTC m=+1348.867127917" observedRunningTime="2026-03-12 15:09:58.192679944 +0000 UTC m=+1350.477905242" watchObservedRunningTime="2026-03-12 15:09:58.192783027 +0000 UTC m=+1350.478008325" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.234611 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.234702 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.306904 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.332609 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.361584 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec7dc5f-67e4-4920-8454-2b3e1f43d593" path="/var/lib/kubelet/pods/aec7dc5f-67e4-4920-8454-2b3e1f43d593/volumes" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.362059 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.365858 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.369778 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.369895 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.370014 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.390248 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.480739 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.481263 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.481419 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.481593 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfmb\" (UniqueName: \"kubernetes.io/projected/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-kube-api-access-nnfmb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.481805 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.503602 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.583246 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-log-httpd\") pod \"cd15c8a9-0582-4c14-9450-407ad0cfa828\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.583309 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-config-data\") pod \"cd15c8a9-0582-4c14-9450-407ad0cfa828\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.583371 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-sg-core-conf-yaml\") pod \"cd15c8a9-0582-4c14-9450-407ad0cfa828\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.583429 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-combined-ca-bundle\") pod \"cd15c8a9-0582-4c14-9450-407ad0cfa828\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.583572 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-scripts\") pod \"cd15c8a9-0582-4c14-9450-407ad0cfa828\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.583653 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9gnr\" (UniqueName: 
\"kubernetes.io/projected/cd15c8a9-0582-4c14-9450-407ad0cfa828-kube-api-access-d9gnr\") pod \"cd15c8a9-0582-4c14-9450-407ad0cfa828\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.583799 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-run-httpd\") pod \"cd15c8a9-0582-4c14-9450-407ad0cfa828\" (UID: \"cd15c8a9-0582-4c14-9450-407ad0cfa828\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.584073 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.584113 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.584146 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.584202 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfmb\" (UniqueName: \"kubernetes.io/projected/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-kube-api-access-nnfmb\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.584230 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.587947 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cd15c8a9-0582-4c14-9450-407ad0cfa828" (UID: "cd15c8a9-0582-4c14-9450-407ad0cfa828"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.588494 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cd15c8a9-0582-4c14-9450-407ad0cfa828" (UID: "cd15c8a9-0582-4c14-9450-407ad0cfa828"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.591961 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd15c8a9-0582-4c14-9450-407ad0cfa828-kube-api-access-d9gnr" (OuterVolumeSpecName: "kube-api-access-d9gnr") pod "cd15c8a9-0582-4c14-9450-407ad0cfa828" (UID: "cd15c8a9-0582-4c14-9450-407ad0cfa828"). InnerVolumeSpecName "kube-api-access-d9gnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.594767 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-scripts" (OuterVolumeSpecName: "scripts") pod "cd15c8a9-0582-4c14-9450-407ad0cfa828" (UID: "cd15c8a9-0582-4c14-9450-407ad0cfa828"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.601717 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.602093 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.607131 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.607208 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc 
kubenswrapper[4869]: I0312 15:09:58.607387 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfmb\" (UniqueName: \"kubernetes.io/projected/a10d0ea8-db1d-4779-9b4c-d97edb20c85e-kube-api-access-nnfmb\") pod \"nova-cell1-novncproxy-0\" (UID: \"a10d0ea8-db1d-4779-9b4c-d97edb20c85e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.644549 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cd15c8a9-0582-4c14-9450-407ad0cfa828" (UID: "cd15c8a9-0582-4c14-9450-407ad0cfa828"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.686257 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.686291 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9gnr\" (UniqueName: \"kubernetes.io/projected/cd15c8a9-0582-4c14-9450-407ad0cfa828-kube-api-access-d9gnr\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.686303 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.686312 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd15c8a9-0582-4c14-9450-407ad0cfa828-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.686353 4869 reconciler_common.go:293] "Volume detached for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.700195 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd15c8a9-0582-4c14-9450-407ad0cfa828" (UID: "cd15c8a9-0582-4c14-9450-407ad0cfa828"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.722413 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-config-data" (OuterVolumeSpecName: "config-data") pod "cd15c8a9-0582-4c14-9450-407ad0cfa828" (UID: "cd15c8a9-0582-4c14-9450-407ad0cfa828"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.784413 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.787600 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.787627 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd15c8a9-0582-4c14-9450-407ad0cfa828-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.796172 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.888278 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7df96c47-1886-48fe-8ee5-8436498d91ce-logs\") pod \"7df96c47-1886-48fe-8ee5-8436498d91ce\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.888665 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df96c47-1886-48fe-8ee5-8436498d91ce-logs" (OuterVolumeSpecName: "logs") pod "7df96c47-1886-48fe-8ee5-8436498d91ce" (UID: "7df96c47-1886-48fe-8ee5-8436498d91ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.888718 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txcmm\" (UniqueName: \"kubernetes.io/projected/7df96c47-1886-48fe-8ee5-8436498d91ce-kube-api-access-txcmm\") pod \"7df96c47-1886-48fe-8ee5-8436498d91ce\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.888961 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-config-data\") pod \"7df96c47-1886-48fe-8ee5-8436498d91ce\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.889013 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-combined-ca-bundle\") pod \"7df96c47-1886-48fe-8ee5-8436498d91ce\" (UID: \"7df96c47-1886-48fe-8ee5-8436498d91ce\") " Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.889503 4869 reconciler_common.go:293] "Volume detached for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7df96c47-1886-48fe-8ee5-8436498d91ce-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.892680 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df96c47-1886-48fe-8ee5-8436498d91ce-kube-api-access-txcmm" (OuterVolumeSpecName: "kube-api-access-txcmm") pod "7df96c47-1886-48fe-8ee5-8436498d91ce" (UID: "7df96c47-1886-48fe-8ee5-8436498d91ce"). InnerVolumeSpecName "kube-api-access-txcmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.930336 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-config-data" (OuterVolumeSpecName: "config-data") pod "7df96c47-1886-48fe-8ee5-8436498d91ce" (UID: "7df96c47-1886-48fe-8ee5-8436498d91ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.944657 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7df96c47-1886-48fe-8ee5-8436498d91ce" (UID: "7df96c47-1886-48fe-8ee5-8436498d91ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.990944 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.990978 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df96c47-1886-48fe-8ee5-8436498d91ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:58 crc kubenswrapper[4869]: I0312 15:09:58.990988 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txcmm\" (UniqueName: \"kubernetes.io/projected/7df96c47-1886-48fe-8ee5-8436498d91ce-kube-api-access-txcmm\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.192074 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.192056 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd15c8a9-0582-4c14-9450-407ad0cfa828","Type":"ContainerDied","Data":"cd3b0393b44f190417d696075b113b7b1391d50f46566b5dd5e1adff25caf78c"} Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.193793 4869 scope.go:117] "RemoveContainer" containerID="df9ada5ff4cc69b25fabf28f675a3451db0d54dce2b7dce3f095cf1db1450059" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.203979 4869 generic.go:334] "Generic (PLEG): container finished" podID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerID="91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1" exitCode=0 Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.204006 4869 generic.go:334] "Generic (PLEG): container finished" podID="7df96c47-1886-48fe-8ee5-8436498d91ce" 
containerID="75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318" exitCode=143 Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.204026 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7df96c47-1886-48fe-8ee5-8436498d91ce","Type":"ContainerDied","Data":"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1"} Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.204052 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7df96c47-1886-48fe-8ee5-8436498d91ce","Type":"ContainerDied","Data":"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318"} Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.204062 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7df96c47-1886-48fe-8ee5-8436498d91ce","Type":"ContainerDied","Data":"fc455202815216cadb5bd016ab8aeb1e7104b27a1f10848fb9a85d2a8f8d5765"} Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.204110 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.225839 4869 scope.go:117] "RemoveContainer" containerID="1292976ee9e3579ce89cfb4080110d00e5a5768b2e284f96fdc169afd47f4f15" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.236248 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.249699 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.262432 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.306932 4869 scope.go:117] "RemoveContainer" containerID="ddb6144292d8b97dcf8e8f16ae0310e23b3126a3c32becdfb1f98d740e54ace6" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.316471 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.331703 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.332330 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-metadata" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332348 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-metadata" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.332362 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-central-agent" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332373 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-central-agent" Mar 12 15:09:59 
crc kubenswrapper[4869]: E0312 15:09:59.332386 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-log" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332392 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-log" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.332410 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="sg-core" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332416 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="sg-core" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.332426 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="proxy-httpd" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332432 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="proxy-httpd" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.332455 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-notification-agent" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332461 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-notification-agent" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332663 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-log" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332677 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="proxy-httpd" Mar 12 15:09:59 
crc kubenswrapper[4869]: I0312 15:09:59.332688 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-notification-agent" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332699 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="sg-core" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332710 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" containerName="nova-metadata-metadata" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.332723 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" containerName="ceilometer-central-agent" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.334601 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.338055 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.338266 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.349237 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.359318 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.361802 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.366970 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.367083 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.378231 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.388246 4869 scope.go:117] "RemoveContainer" containerID="abef6ce10f9b78d09aba11e8f0720335aa87354e3fd0046148a390edfefff7ec" Mar 12 15:09:59 crc kubenswrapper[4869]: W0312 15:09:59.403630 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda10d0ea8_db1d_4779_9b4c_d97edb20c85e.slice/crio-235eaa531a20e3a545534d20f3968b09e118379b91c73ee5b19b7df394d6883d WatchSource:0}: Error finding container 235eaa531a20e3a545534d20f3968b09e118379b91c73ee5b19b7df394d6883d: Status 404 returned error can't find the container with id 235eaa531a20e3a545534d20f3968b09e118379b91c73ee5b19b7df394d6883d Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.412616 4869 scope.go:117] "RemoveContainer" containerID="91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.412620 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.448172 4869 scope.go:117] "RemoveContainer" containerID="75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510118 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-log-httpd\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510204 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510228 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e455c-91ed-49cc-9072-0b9abda6ba0d-logs\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510256 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpnkq\" (UniqueName: \"kubernetes.io/projected/09af255f-def2-476c-bf2e-77a49b59388d-kube-api-access-jpnkq\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510296 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-scripts\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510323 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510841 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-config-data\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510885 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qk8\" (UniqueName: \"kubernetes.io/projected/df3e455c-91ed-49cc-9072-0b9abda6ba0d-kube-api-access-75qk8\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.510915 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-run-httpd\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.511057 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-config-data\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.511186 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: 
I0312 15:09:59.511254 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.522915 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: can't talk to a V1 container registry" image="quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.523125 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell1-novncproxy-novncproxy,Image:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n88h685h59ch578h54bh66h5fh567h9h5cfh596h667h68ch5d8h59ch9h4h5dch564h684h85hd5h599h5cfh5c4h65bh5d6hdbh5bdh564hc8h675q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-novncproxy-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:nova-novncproxy-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/nova-novncproxy.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:nova-novncproxy-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/nova-novncproxy.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:vencrypt-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/vencrypt.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:vencrypt-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/vencrypt.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnfmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 
6080 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell1-novncproxy-0_openstack(a10d0ea8-db1d-4779-9b4c-d97edb20c85e): ErrImagePull: initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: can't talk to a V1 container registry" logger="UnhandledError" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.524238 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ErrImagePull: \"initializing source 
docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: can't talk to a V1 container registry\"" pod="openstack/nova-cell1-novncproxy-0" podUID="a10d0ea8-db1d-4779-9b4c-d97edb20c85e" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.603385 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice/crio-23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7\": RecentStats: unable to find data in memory cache]" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.614028 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.614106 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.614926 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-log-httpd\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.614315 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-log-httpd\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615367 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615397 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e455c-91ed-49cc-9072-0b9abda6ba0d-logs\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615428 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpnkq\" (UniqueName: \"kubernetes.io/projected/09af255f-def2-476c-bf2e-77a49b59388d-kube-api-access-jpnkq\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615529 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-scripts\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615580 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " 
pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615655 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-config-data\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615699 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qk8\" (UniqueName: \"kubernetes.io/projected/df3e455c-91ed-49cc-9072-0b9abda6ba0d-kube-api-access-75qk8\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615724 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-run-httpd\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.615798 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-config-data\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.616072 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e455c-91ed-49cc-9072-0b9abda6ba0d-logs\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.616590 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-run-httpd\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.625710 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.625995 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-config-data\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.627107 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.628126 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.628678 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-config-data\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.630025 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.630922 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpnkq\" (UniqueName: \"kubernetes.io/projected/09af255f-def2-476c-bf2e-77a49b59388d-kube-api-access-jpnkq\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.631338 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-scripts\") pod \"ceilometer-0\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.632687 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qk8\" (UniqueName: \"kubernetes.io/projected/df3e455c-91ed-49cc-9072-0b9abda6ba0d-kube-api-access-75qk8\") pod \"nova-metadata-0\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.662312 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.709309 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.865371 4869 scope.go:117] "RemoveContainer" containerID="91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.866198 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1\": container with ID starting with 91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1 not found: ID does not exist" containerID="91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.866255 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1"} err="failed to get container status \"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1\": rpc error: code = NotFound desc = could not find container \"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1\": container with ID starting with 91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1 not found: ID does not exist" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.866292 4869 scope.go:117] "RemoveContainer" containerID="75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318" Mar 12 15:09:59 crc kubenswrapper[4869]: E0312 15:09:59.866703 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318\": container with ID starting with 75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318 not found: ID does not exist" containerID="75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 
15:09:59.866734 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318"} err="failed to get container status \"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318\": rpc error: code = NotFound desc = could not find container \"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318\": container with ID starting with 75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318 not found: ID does not exist" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.866756 4869 scope.go:117] "RemoveContainer" containerID="91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.867080 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1"} err="failed to get container status \"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1\": rpc error: code = NotFound desc = could not find container \"91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1\": container with ID starting with 91b094bf7f2252d2a0c565d3e1f6482ce649815ab8dbef0cacc6fdacaba739d1 not found: ID does not exist" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.867101 4869 scope.go:117] "RemoveContainer" containerID="75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318" Mar 12 15:09:59 crc kubenswrapper[4869]: I0312 15:09:59.867415 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318"} err="failed to get container status \"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318\": rpc error: code = NotFound desc = could not find container \"75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318\": container with ID starting with 
75dab8a1f04e26fefa8afb3a51f7c8930a209ed96943e5ddbc760dd150f26318 not found: ID does not exist" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.157419 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555470-wzk9x"] Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.160576 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.162645 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.162736 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.164431 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.190528 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-wzk9x"] Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.231157 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a10d0ea8-db1d-4779-9b4c-d97edb20c85e","Type":"ContainerStarted","Data":"235eaa531a20e3a545534d20f3968b09e118379b91c73ee5b19b7df394d6883d"} Mar 12 15:10:00 crc kubenswrapper[4869]: E0312 15:10:00.237994 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified\\\"\"" pod="openstack/nova-cell1-novncproxy-0" podUID="a10d0ea8-db1d-4779-9b4c-d97edb20c85e" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.243704 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfmgn\" (UniqueName: \"kubernetes.io/projected/84874bd4-e292-42eb-bb62-edbd0500482f-kube-api-access-cfmgn\") pod \"auto-csr-approver-29555470-wzk9x\" (UID: \"84874bd4-e292-42eb-bb62-edbd0500482f\") " pod="openshift-infra/auto-csr-approver-29555470-wzk9x" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.284089 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.651463781 podStartE2EDuration="7.284065175s" podCreationTimestamp="2026-03-12 15:09:53 +0000 UTC" firstStartedPulling="2026-03-12 15:09:54.299022431 +0000 UTC m=+1346.584247699" lastFinishedPulling="2026-03-12 15:09:59.931623815 +0000 UTC m=+1352.216849093" observedRunningTime="2026-03-12 15:10:00.277992251 +0000 UTC m=+1352.563217529" watchObservedRunningTime="2026-03-12 15:10:00.284065175 +0000 UTC m=+1352.569290453" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.347055 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfmgn\" (UniqueName: \"kubernetes.io/projected/84874bd4-e292-42eb-bb62-edbd0500482f-kube-api-access-cfmgn\") pod \"auto-csr-approver-29555470-wzk9x\" (UID: \"84874bd4-e292-42eb-bb62-edbd0500482f\") " pod="openshift-infra/auto-csr-approver-29555470-wzk9x" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.356427 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df96c47-1886-48fe-8ee5-8436498d91ce" path="/var/lib/kubelet/pods/7df96c47-1886-48fe-8ee5-8436498d91ce/volumes" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.358508 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd15c8a9-0582-4c14-9450-407ad0cfa828" path="/var/lib/kubelet/pods/cd15c8a9-0582-4c14-9450-407ad0cfa828/volumes" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.379265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-cfmgn\" (UniqueName: \"kubernetes.io/projected/84874bd4-e292-42eb-bb62-edbd0500482f-kube-api-access-cfmgn\") pod \"auto-csr-approver-29555470-wzk9x\" (UID: \"84874bd4-e292-42eb-bb62-edbd0500482f\") " pod="openshift-infra/auto-csr-approver-29555470-wzk9x" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.492845 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.561436 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:10:00 crc kubenswrapper[4869]: I0312 15:10:00.722552 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.036470 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-wzk9x"] Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.263274 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba9b14c-abed-454e-a2f3-809ccafcd42f","Type":"ContainerStarted","Data":"7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238"} Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.266386 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" event={"ID":"84874bd4-e292-42eb-bb62-edbd0500482f","Type":"ContainerStarted","Data":"f70ea22419adbed32ab8d4304e0cc07582332b38499d9553dc0ff23d7269b7fc"} Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.273178 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerStarted","Data":"44d83d620e6d575294fd36749c92e155694e571fd0ffb96b94561349e9ad3ad5"} Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.273217 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerStarted","Data":"307f4e18466d3ee16e17bfd1f206d14eb3d9e722823d7791de5a1821ef9c7adb"} Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.275792 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3e455c-91ed-49cc-9072-0b9abda6ba0d","Type":"ContainerStarted","Data":"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014"} Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.275948 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3e455c-91ed-49cc-9072-0b9abda6ba0d","Type":"ContainerStarted","Data":"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e"} Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.276037 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3e455c-91ed-49cc-9072-0b9abda6ba0d","Type":"ContainerStarted","Data":"d40242a5631088d0750f4377b3874b762424acaffa7272dbfc343dc908a7b6ab"} Mar 12 15:10:01 crc kubenswrapper[4869]: E0312 15:10:01.283773 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified\\\"\"" pod="openstack/nova-cell1-novncproxy-0" podUID="a10d0ea8-db1d-4779-9b4c-d97edb20c85e" Mar 12 15:10:01 crc kubenswrapper[4869]: I0312 15:10:01.317237 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.317219918 podStartE2EDuration="2.317219918s" podCreationTimestamp="2026-03-12 15:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:01.314588822 +0000 UTC m=+1353.599814100" 
watchObservedRunningTime="2026-03-12 15:10:01.317219918 +0000 UTC m=+1353.602445186" Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.298385 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerStarted","Data":"6ab533bcf44cdccef55fa48cbb2761294a453d5324f8d62c305a78c960df28fd"} Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.298842 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerStarted","Data":"6efa829c29242f44e8e07501ff5369c59051fd3c4af89adc548677b9bd41977c"} Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.301008 4869 generic.go:334] "Generic (PLEG): container finished" podID="2829ee85-1daa-4418-b99c-90bbebceb7c2" containerID="59b4471156ad7b9f1040a028511e1b2918ce761e3e6f4b926fc37d1b83e33231" exitCode=0 Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.301078 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bzm2j" event={"ID":"2829ee85-1daa-4418-b99c-90bbebceb7c2","Type":"ContainerDied","Data":"59b4471156ad7b9f1040a028511e1b2918ce761e3e6f4b926fc37d1b83e33231"} Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.303144 4869 generic.go:334] "Generic (PLEG): container finished" podID="f547181b-aa83-44ec-9c1a-dfbc394ff6a5" containerID="4fc0e8e6db5dd590434b9022f05c0318a9911d651083687ec672665e85926ce1" exitCode=0 Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.303174 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdqpj" event={"ID":"f547181b-aa83-44ec-9c1a-dfbc394ff6a5","Type":"ContainerDied","Data":"4fc0e8e6db5dd590434b9022f05c0318a9911d651083687ec672665e85926ce1"} Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.373816 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.451741 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-snqth"] Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.451987 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56696ff475-snqth" podUID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerName="dnsmasq-dns" containerID="cri-o://19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6" gracePeriod=10 Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.503095 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.504271 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.561351 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 15:10:03 crc kubenswrapper[4869]: I0312 15:10:03.960639 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.129022 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-config\") pod \"7d8820b4-254f-4f89-8609-a8b86b0d5796\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.129112 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh92j\" (UniqueName: \"kubernetes.io/projected/7d8820b4-254f-4f89-8609-a8b86b0d5796-kube-api-access-qh92j\") pod \"7d8820b4-254f-4f89-8609-a8b86b0d5796\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.129170 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-nb\") pod \"7d8820b4-254f-4f89-8609-a8b86b0d5796\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.129363 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-sb\") pod \"7d8820b4-254f-4f89-8609-a8b86b0d5796\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.129439 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-svc\") pod \"7d8820b4-254f-4f89-8609-a8b86b0d5796\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.129562 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-swift-storage-0\") pod \"7d8820b4-254f-4f89-8609-a8b86b0d5796\" (UID: \"7d8820b4-254f-4f89-8609-a8b86b0d5796\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.140175 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8820b4-254f-4f89-8609-a8b86b0d5796-kube-api-access-qh92j" (OuterVolumeSpecName: "kube-api-access-qh92j") pod "7d8820b4-254f-4f89-8609-a8b86b0d5796" (UID: "7d8820b4-254f-4f89-8609-a8b86b0d5796"). InnerVolumeSpecName "kube-api-access-qh92j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.235398 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh92j\" (UniqueName: \"kubernetes.io/projected/7d8820b4-254f-4f89-8609-a8b86b0d5796-kube-api-access-qh92j\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.284472 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-config" (OuterVolumeSpecName: "config") pod "7d8820b4-254f-4f89-8609-a8b86b0d5796" (UID: "7d8820b4-254f-4f89-8609-a8b86b0d5796"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.326222 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d8820b4-254f-4f89-8609-a8b86b0d5796" (UID: "7d8820b4-254f-4f89-8609-a8b86b0d5796"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.342568 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d8820b4-254f-4f89-8609-a8b86b0d5796" (UID: "7d8820b4-254f-4f89-8609-a8b86b0d5796"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.350582 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.351243 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.356410 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerID="19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6" exitCode=0 Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.357375 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-snqth" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.370461 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" podStartSLOduration=2.075136536 podStartE2EDuration="4.370443051s" podCreationTimestamp="2026-03-12 15:10:00 +0000 UTC" firstStartedPulling="2026-03-12 15:10:01.053833084 +0000 UTC m=+1353.339058362" lastFinishedPulling="2026-03-12 15:10:03.349139599 +0000 UTC m=+1355.634364877" observedRunningTime="2026-03-12 15:10:04.368052873 +0000 UTC m=+1356.653278151" watchObservedRunningTime="2026-03-12 15:10:04.370443051 +0000 UTC m=+1356.655668329" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.379230 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d8820b4-254f-4f89-8609-a8b86b0d5796" (UID: "7d8820b4-254f-4f89-8609-a8b86b0d5796"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.391298 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d8820b4-254f-4f89-8609-a8b86b0d5796" (UID: "7d8820b4-254f-4f89-8609-a8b86b0d5796"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.453978 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.454015 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.454030 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d8820b4-254f-4f89-8609-a8b86b0d5796-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.485243 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" event={"ID":"84874bd4-e292-42eb-bb62-edbd0500482f","Type":"ContainerStarted","Data":"07488d27a0bf22f7f29266dbf115e85fb87fb51aa90e5154a49f24c7dc895676"} Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.485318 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-snqth" event={"ID":"7d8820b4-254f-4f89-8609-a8b86b0d5796","Type":"ContainerDied","Data":"19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6"} Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.485369 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.485380 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-snqth" event={"ID":"7d8820b4-254f-4f89-8609-a8b86b0d5796","Type":"ContainerDied","Data":"e85cf88af6a7a070fef5545697d2390f90c3fa7993c40a268a1ecb0601f8e04d"} Mar 12 15:10:04 crc 
kubenswrapper[4869]: I0312 15:10:04.485397 4869 scope.go:117] "RemoveContainer" containerID="19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.533675 4869 scope.go:117] "RemoveContainer" containerID="56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.580643 4869 scope.go:117] "RemoveContainer" containerID="19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6" Mar 12 15:10:04 crc kubenswrapper[4869]: E0312 15:10:04.581359 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6\": container with ID starting with 19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6 not found: ID does not exist" containerID="19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.581391 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6"} err="failed to get container status \"19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6\": rpc error: code = NotFound desc = could not find container \"19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6\": container with ID starting with 19b0544d3c6ffd94718f36ac8538fafc6b217424698cb2135894577bfe52dcb6 not found: ID does not exist" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.581412 4869 scope.go:117] "RemoveContainer" containerID="56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22" Mar 12 15:10:04 crc kubenswrapper[4869]: E0312 15:10:04.581823 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22\": container with ID starting with 56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22 not found: ID does not exist" containerID="56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.581851 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22"} err="failed to get container status \"56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22\": rpc error: code = NotFound desc = could not find container \"56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22\": container with ID starting with 56b88304e21c94aee28d3896cf3e49687eab4ce788807f2cab4c37e04bdaed22 not found: ID does not exist" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.637958 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdqpj" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.712722 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.713338 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.716093 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-snqth"] Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.726059 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-snqth"] Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.761465 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-scripts\") pod 
\"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.761599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-config-data\") pod \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.761665 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-combined-ca-bundle\") pod \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.761730 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrcbs\" (UniqueName: \"kubernetes.io/projected/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-kube-api-access-qrcbs\") pod \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\" (UID: \"f547181b-aa83-44ec-9c1a-dfbc394ff6a5\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.765960 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-kube-api-access-qrcbs" (OuterVolumeSpecName: "kube-api-access-qrcbs") pod "f547181b-aa83-44ec-9c1a-dfbc394ff6a5" (UID: "f547181b-aa83-44ec-9c1a-dfbc394ff6a5"). InnerVolumeSpecName "kube-api-access-qrcbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.770636 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-scripts" (OuterVolumeSpecName: "scripts") pod "f547181b-aa83-44ec-9c1a-dfbc394ff6a5" (UID: "f547181b-aa83-44ec-9c1a-dfbc394ff6a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.804059 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f547181b-aa83-44ec-9c1a-dfbc394ff6a5" (UID: "f547181b-aa83-44ec-9c1a-dfbc394ff6a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.809927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-config-data" (OuterVolumeSpecName: "config-data") pod "f547181b-aa83-44ec-9c1a-dfbc394ff6a5" (UID: "f547181b-aa83-44ec-9c1a-dfbc394ff6a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.814734 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.864956 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.864982 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.864991 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.865021 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrcbs\" (UniqueName: \"kubernetes.io/projected/f547181b-aa83-44ec-9c1a-dfbc394ff6a5-kube-api-access-qrcbs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.966272 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5jqp\" (UniqueName: \"kubernetes.io/projected/2829ee85-1daa-4418-b99c-90bbebceb7c2-kube-api-access-n5jqp\") pod \"2829ee85-1daa-4418-b99c-90bbebceb7c2\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.966324 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-config-data\") pod \"2829ee85-1daa-4418-b99c-90bbebceb7c2\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.966494 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-scripts\") pod \"2829ee85-1daa-4418-b99c-90bbebceb7c2\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.966606 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-combined-ca-bundle\") pod \"2829ee85-1daa-4418-b99c-90bbebceb7c2\" (UID: \"2829ee85-1daa-4418-b99c-90bbebceb7c2\") " Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.970971 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-scripts" (OuterVolumeSpecName: "scripts") pod "2829ee85-1daa-4418-b99c-90bbebceb7c2" (UID: "2829ee85-1daa-4418-b99c-90bbebceb7c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.971256 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2829ee85-1daa-4418-b99c-90bbebceb7c2-kube-api-access-n5jqp" (OuterVolumeSpecName: "kube-api-access-n5jqp") pod "2829ee85-1daa-4418-b99c-90bbebceb7c2" (UID: "2829ee85-1daa-4418-b99c-90bbebceb7c2"). InnerVolumeSpecName "kube-api-access-n5jqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:04 crc kubenswrapper[4869]: I0312 15:10:04.993864 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2829ee85-1daa-4418-b99c-90bbebceb7c2" (UID: "2829ee85-1daa-4418-b99c-90bbebceb7c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.000011 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-config-data" (OuterVolumeSpecName: "config-data") pod "2829ee85-1daa-4418-b99c-90bbebceb7c2" (UID: "2829ee85-1daa-4418-b99c-90bbebceb7c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.069018 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.069060 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.069073 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5jqp\" (UniqueName: \"kubernetes.io/projected/2829ee85-1daa-4418-b99c-90bbebceb7c2-kube-api-access-n5jqp\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.069082 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2829ee85-1daa-4418-b99c-90bbebceb7c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.372058 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bzm2j" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.372058 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bzm2j" event={"ID":"2829ee85-1daa-4418-b99c-90bbebceb7c2","Type":"ContainerDied","Data":"70345be3937fbef859414d2706e6c25f9c9d6d464a5fca813ba474a6b2ee2b76"} Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.372191 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70345be3937fbef859414d2706e6c25f9c9d6d464a5fca813ba474a6b2ee2b76" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.374346 4869 generic.go:334] "Generic (PLEG): container finished" podID="84874bd4-e292-42eb-bb62-edbd0500482f" containerID="07488d27a0bf22f7f29266dbf115e85fb87fb51aa90e5154a49f24c7dc895676" exitCode=0 Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.374425 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" event={"ID":"84874bd4-e292-42eb-bb62-edbd0500482f","Type":"ContainerDied","Data":"07488d27a0bf22f7f29266dbf115e85fb87fb51aa90e5154a49f24c7dc895676"} Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.376605 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wdqpj" event={"ID":"f547181b-aa83-44ec-9c1a-dfbc394ff6a5","Type":"ContainerDied","Data":"79b83a6cb1d80c12019079822ed494ff34fea75407c1f57968c864be2bc5b32c"} Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.376625 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wdqpj" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.376632 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79b83a6cb1d80c12019079822ed494ff34fea75407c1f57968c864be2bc5b32c" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.425797 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 15:10:05 crc kubenswrapper[4869]: E0312 15:10:05.426522 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2829ee85-1daa-4418-b99c-90bbebceb7c2" containerName="nova-manage" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.426561 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2829ee85-1daa-4418-b99c-90bbebceb7c2" containerName="nova-manage" Mar 12 15:10:05 crc kubenswrapper[4869]: E0312 15:10:05.426581 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerName="dnsmasq-dns" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.426589 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerName="dnsmasq-dns" Mar 12 15:10:05 crc kubenswrapper[4869]: E0312 15:10:05.426607 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerName="init" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.426614 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerName="init" Mar 12 15:10:05 crc kubenswrapper[4869]: E0312 15:10:05.426631 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f547181b-aa83-44ec-9c1a-dfbc394ff6a5" containerName="nova-cell1-conductor-db-sync" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.426641 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f547181b-aa83-44ec-9c1a-dfbc394ff6a5" 
containerName="nova-cell1-conductor-db-sync" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.426875 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2829ee85-1daa-4418-b99c-90bbebceb7c2" containerName="nova-manage" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.426903 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f547181b-aa83-44ec-9c1a-dfbc394ff6a5" containerName="nova-cell1-conductor-db-sync" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.426929 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8820b4-254f-4f89-8609-a8b86b0d5796" containerName="dnsmasq-dns" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.427702 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.429776 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.449710 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.577706 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ztv8\" (UniqueName: \"kubernetes.io/projected/cda652da-aabd-4fd6-91f3-0da6ec497845-kube-api-access-7ztv8\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.577794 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda652da-aabd-4fd6-91f3-0da6ec497845-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc 
kubenswrapper[4869]: I0312 15:10:05.577817 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda652da-aabd-4fd6-91f3-0da6ec497845-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.598881 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.630271 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.646299 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.679417 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda652da-aabd-4fd6-91f3-0da6ec497845-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.679851 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ztv8\" (UniqueName: \"kubernetes.io/projected/cda652da-aabd-4fd6-91f3-0da6ec497845-kube-api-access-7ztv8\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.679948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda652da-aabd-4fd6-91f3-0da6ec497845-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc 
kubenswrapper[4869]: I0312 15:10:05.686298 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda652da-aabd-4fd6-91f3-0da6ec497845-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.703237 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ztv8\" (UniqueName: \"kubernetes.io/projected/cda652da-aabd-4fd6-91f3-0da6ec497845-kube-api-access-7ztv8\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.703441 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda652da-aabd-4fd6-91f3-0da6ec497845-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cda652da-aabd-4fd6-91f3-0da6ec497845\") " pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.746822 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:05 crc kubenswrapper[4869]: I0312 15:10:05.947520 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.091139 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-combined-ca-bundle\") pod \"5f1248e0-398e-48f9-adad-d214c7700bb8\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.091316 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1248e0-398e-48f9-adad-d214c7700bb8-logs\") pod \"5f1248e0-398e-48f9-adad-d214c7700bb8\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.091379 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6cn6\" (UniqueName: \"kubernetes.io/projected/5f1248e0-398e-48f9-adad-d214c7700bb8-kube-api-access-z6cn6\") pod \"5f1248e0-398e-48f9-adad-d214c7700bb8\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.091450 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-config-data\") pod \"5f1248e0-398e-48f9-adad-d214c7700bb8\" (UID: \"5f1248e0-398e-48f9-adad-d214c7700bb8\") " Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.092807 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1248e0-398e-48f9-adad-d214c7700bb8-logs" (OuterVolumeSpecName: "logs") pod "5f1248e0-398e-48f9-adad-d214c7700bb8" (UID: "5f1248e0-398e-48f9-adad-d214c7700bb8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.097030 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1248e0-398e-48f9-adad-d214c7700bb8-kube-api-access-z6cn6" (OuterVolumeSpecName: "kube-api-access-z6cn6") pod "5f1248e0-398e-48f9-adad-d214c7700bb8" (UID: "5f1248e0-398e-48f9-adad-d214c7700bb8"). InnerVolumeSpecName "kube-api-access-z6cn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.100335 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-config-data" (OuterVolumeSpecName: "config-data") pod "5f1248e0-398e-48f9-adad-d214c7700bb8" (UID: "5f1248e0-398e-48f9-adad-d214c7700bb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.102305 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f1248e0-398e-48f9-adad-d214c7700bb8" (UID: "5f1248e0-398e-48f9-adad-d214c7700bb8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.194301 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.194689 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1248e0-398e-48f9-adad-d214c7700bb8-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.194753 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6cn6\" (UniqueName: \"kubernetes.io/projected/5f1248e0-398e-48f9-adad-d214c7700bb8-kube-api-access-z6cn6\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.194833 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1248e0-398e-48f9-adad-d214c7700bb8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.267905 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.409259 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8820b4-254f-4f89-8609-a8b86b0d5796" path="/var/lib/kubelet/pods/7d8820b4-254f-4f89-8609-a8b86b0d5796/volumes" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.411951 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cda652da-aabd-4fd6-91f3-0da6ec497845","Type":"ContainerStarted","Data":"4ec59246171b1469d15d25c4c3449c9009db79807cc95d10acef25e407aa9d76"} Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.413522 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.413964 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-log" containerID="cri-o://a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e" gracePeriod=30 Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.414063 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-metadata" containerID="cri-o://297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014" gracePeriod=30 Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.414307 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f1248e0-398e-48f9-adad-d214c7700bb8","Type":"ContainerDied","Data":"354a3583d6498691de0324a3991957292fc80e30d59e3a9a5ba5704911370370"} Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.486859 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.499595 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.536659 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.539531 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.541481 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.541864 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.710833 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.711178 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/101c26e1-fa16-444e-936e-82baa2aebf60-logs\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.711224 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-config-data\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.711292 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7kcg\" (UniqueName: \"kubernetes.io/projected/101c26e1-fa16-444e-936e-82baa2aebf60-kube-api-access-z7kcg\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.806432 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.812208 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfmgn\" (UniqueName: \"kubernetes.io/projected/84874bd4-e292-42eb-bb62-edbd0500482f-kube-api-access-cfmgn\") pod \"84874bd4-e292-42eb-bb62-edbd0500482f\" (UID: \"84874bd4-e292-42eb-bb62-edbd0500482f\") " Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.812632 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.812671 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/101c26e1-fa16-444e-936e-82baa2aebf60-logs\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.812723 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-config-data\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.812814 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7kcg\" (UniqueName: \"kubernetes.io/projected/101c26e1-fa16-444e-936e-82baa2aebf60-kube-api-access-z7kcg\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.813944 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/101c26e1-fa16-444e-936e-82baa2aebf60-logs\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.820623 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-config-data\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.822735 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84874bd4-e292-42eb-bb62-edbd0500482f-kube-api-access-cfmgn" (OuterVolumeSpecName: "kube-api-access-cfmgn") pod "84874bd4-e292-42eb-bb62-edbd0500482f" (UID: "84874bd4-e292-42eb-bb62-edbd0500482f"). InnerVolumeSpecName "kube-api-access-cfmgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.823303 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.835985 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7kcg\" (UniqueName: \"kubernetes.io/projected/101c26e1-fa16-444e-936e-82baa2aebf60-kube-api-access-z7kcg\") pod \"nova-api-0\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.865213 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.915001 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfmgn\" (UniqueName: \"kubernetes.io/projected/84874bd4-e292-42eb-bb62-edbd0500482f-kube-api-access-cfmgn\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:06 crc kubenswrapper[4869]: I0312 15:10:06.966679 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.017377 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e455c-91ed-49cc-9072-0b9abda6ba0d-logs\") pod \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.017427 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qk8\" (UniqueName: \"kubernetes.io/projected/df3e455c-91ed-49cc-9072-0b9abda6ba0d-kube-api-access-75qk8\") pod \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.017461 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-combined-ca-bundle\") pod \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.017592 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-config-data\") pod \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.017645 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-nova-metadata-tls-certs\") pod \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\" (UID: \"df3e455c-91ed-49cc-9072-0b9abda6ba0d\") " Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.017960 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df3e455c-91ed-49cc-9072-0b9abda6ba0d-logs" (OuterVolumeSpecName: "logs") pod "df3e455c-91ed-49cc-9072-0b9abda6ba0d" (UID: "df3e455c-91ed-49cc-9072-0b9abda6ba0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.018343 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df3e455c-91ed-49cc-9072-0b9abda6ba0d-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.025746 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3e455c-91ed-49cc-9072-0b9abda6ba0d-kube-api-access-75qk8" (OuterVolumeSpecName: "kube-api-access-75qk8") pod "df3e455c-91ed-49cc-9072-0b9abda6ba0d" (UID: "df3e455c-91ed-49cc-9072-0b9abda6ba0d"). InnerVolumeSpecName "kube-api-access-75qk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.067966 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-config-data" (OuterVolumeSpecName: "config-data") pod "df3e455c-91ed-49cc-9072-0b9abda6ba0d" (UID: "df3e455c-91ed-49cc-9072-0b9abda6ba0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.076650 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df3e455c-91ed-49cc-9072-0b9abda6ba0d" (UID: "df3e455c-91ed-49cc-9072-0b9abda6ba0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.120931 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qk8\" (UniqueName: \"kubernetes.io/projected/df3e455c-91ed-49cc-9072-0b9abda6ba0d-kube-api-access-75qk8\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.120985 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.121000 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.131683 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "df3e455c-91ed-49cc-9072-0b9abda6ba0d" (UID: "df3e455c-91ed-49cc-9072-0b9abda6ba0d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.170398 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.223042 4869 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3e455c-91ed-49cc-9072-0b9abda6ba0d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.432707 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" event={"ID":"84874bd4-e292-42eb-bb62-edbd0500482f","Type":"ContainerDied","Data":"f70ea22419adbed32ab8d4304e0cc07582332b38499d9553dc0ff23d7269b7fc"} Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.433076 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f70ea22419adbed32ab8d4304e0cc07582332b38499d9553dc0ff23d7269b7fc" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.433086 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-wzk9x" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.438882 4869 generic.go:334] "Generic (PLEG): container finished" podID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerID="297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014" exitCode=0 Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.438918 4869 generic.go:334] "Generic (PLEG): container finished" podID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerID="a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e" exitCode=143 Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.438924 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.438976 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3e455c-91ed-49cc-9072-0b9abda6ba0d","Type":"ContainerDied","Data":"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014"} Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.439007 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3e455c-91ed-49cc-9072-0b9abda6ba0d","Type":"ContainerDied","Data":"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e"} Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.439021 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df3e455c-91ed-49cc-9072-0b9abda6ba0d","Type":"ContainerDied","Data":"d40242a5631088d0750f4377b3874b762424acaffa7272dbfc343dc908a7b6ab"} Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.439035 4869 scope.go:117] "RemoveContainer" containerID="297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.443716 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cda652da-aabd-4fd6-91f3-0da6ec497845","Type":"ContainerStarted","Data":"3092dba4dfd8408cd5fee4d1f2d671876abc656520ffd10aebf855619eed3400"} Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.444531 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.445475 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-nrhv5"] Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.446979 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="0ba9b14c-abed-454e-a2f3-809ccafcd42f" containerName="nova-scheduler-scheduler" containerID="cri-o://7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" gracePeriod=30 Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.447057 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"101c26e1-fa16-444e-936e-82baa2aebf60","Type":"ContainerStarted","Data":"50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a"} Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.447078 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"101c26e1-fa16-444e-936e-82baa2aebf60","Type":"ContainerStarted","Data":"b25b47c90ee26af12853d342a01e8f708695c4b3b0b4cb8f64bdc3b56ff47f63"} Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.466982 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-nrhv5"] Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.467137 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.467124825 podStartE2EDuration="2.467124825s" podCreationTimestamp="2026-03-12 15:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:07.460879497 +0000 UTC m=+1359.746104775" watchObservedRunningTime="2026-03-12 15:10:07.467124825 +0000 UTC m=+1359.752350103" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.508249 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.514670 4869 scope.go:117] "RemoveContainer" containerID="a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.524085 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.533731 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:07 crc kubenswrapper[4869]: E0312 15:10:07.534174 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-metadata" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.534197 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-metadata" Mar 12 15:10:07 crc kubenswrapper[4869]: E0312 15:10:07.534281 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-log" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.534289 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-log" Mar 12 15:10:07 crc kubenswrapper[4869]: E0312 15:10:07.534303 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84874bd4-e292-42eb-bb62-edbd0500482f" containerName="oc" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.534309 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="84874bd4-e292-42eb-bb62-edbd0500482f" containerName="oc" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.534483 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-metadata" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.534496 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" containerName="nova-metadata-log" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.534507 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="84874bd4-e292-42eb-bb62-edbd0500482f" containerName="oc" Mar 12 15:10:07 crc 
kubenswrapper[4869]: I0312 15:10:07.535456 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.537801 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.538019 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.542250 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.542321 4869 scope.go:117] "RemoveContainer" containerID="297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014" Mar 12 15:10:07 crc kubenswrapper[4869]: E0312 15:10:07.543045 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014\": container with ID starting with 297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014 not found: ID does not exist" containerID="297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.543121 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014"} err="failed to get container status \"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014\": rpc error: code = NotFound desc = could not find container \"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014\": container with ID starting with 297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014 not found: ID does not exist" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.543158 4869 scope.go:117] "RemoveContainer" 
containerID="a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e" Mar 12 15:10:07 crc kubenswrapper[4869]: E0312 15:10:07.543420 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e\": container with ID starting with a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e not found: ID does not exist" containerID="a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.543444 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e"} err="failed to get container status \"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e\": rpc error: code = NotFound desc = could not find container \"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e\": container with ID starting with a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e not found: ID does not exist" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.543459 4869 scope.go:117] "RemoveContainer" containerID="297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.543721 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014"} err="failed to get container status \"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014\": rpc error: code = NotFound desc = could not find container \"297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014\": container with ID starting with 297c3a9e7f7731c9d548a37a7a68a5dc3c4103d909eb39454fb2f43c33aca014 not found: ID does not exist" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.543773 4869 scope.go:117] 
"RemoveContainer" containerID="a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.544063 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e"} err="failed to get container status \"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e\": rpc error: code = NotFound desc = could not find container \"a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e\": container with ID starting with a9eb0a69790a2fec0a70f53a3d7f7136004c5b1875fe1988db4e54fc173b2c9e not found: ID does not exist" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.629934 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-config-data\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.629989 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvlfz\" (UniqueName: \"kubernetes.io/projected/7d835be4-a958-46b9-8319-4a13cb8ee018-kube-api-access-gvlfz\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.630049 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.630148 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d835be4-a958-46b9-8319-4a13cb8ee018-logs\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.630252 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.731240 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.731348 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-config-data\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.731379 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvlfz\" (UniqueName: \"kubernetes.io/projected/7d835be4-a958-46b9-8319-4a13cb8ee018-kube-api-access-gvlfz\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.731413 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.731497 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d835be4-a958-46b9-8319-4a13cb8ee018-logs\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.732267 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d835be4-a958-46b9-8319-4a13cb8ee018-logs\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.735334 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-config-data\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.735821 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.736237 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.757333 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvlfz\" 
(UniqueName: \"kubernetes.io/projected/7d835be4-a958-46b9-8319-4a13cb8ee018-kube-api-access-gvlfz\") pod \"nova-metadata-0\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " pod="openstack/nova-metadata-0" Mar 12 15:10:07 crc kubenswrapper[4869]: I0312 15:10:07.856584 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:08 crc kubenswrapper[4869]: W0312 15:10:08.327577 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d835be4_a958_46b9_8319_4a13cb8ee018.slice/crio-e29d46567310258a937b32673b5e52bc020aade7a46aed78f2f185cd566ec626 WatchSource:0}: Error finding container e29d46567310258a937b32673b5e52bc020aade7a46aed78f2f185cd566ec626: Status 404 returned error can't find the container with id e29d46567310258a937b32673b5e52bc020aade7a46aed78f2f185cd566ec626 Mar 12 15:10:08 crc kubenswrapper[4869]: I0312 15:10:08.333393 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:08 crc kubenswrapper[4869]: I0312 15:10:08.377983 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1248e0-398e-48f9-adad-d214c7700bb8" path="/var/lib/kubelet/pods/5f1248e0-398e-48f9-adad-d214c7700bb8/volumes" Mar 12 15:10:08 crc kubenswrapper[4869]: I0312 15:10:08.378988 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec8e133-2a4d-4494-9372-eeb75efc75b5" path="/var/lib/kubelet/pods/6ec8e133-2a4d-4494-9372-eeb75efc75b5/volumes" Mar 12 15:10:08 crc kubenswrapper[4869]: I0312 15:10:08.382930 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df3e455c-91ed-49cc-9072-0b9abda6ba0d" path="/var/lib/kubelet/pods/df3e455c-91ed-49cc-9072-0b9abda6ba0d/volumes" Mar 12 15:10:08 crc kubenswrapper[4869]: I0312 15:10:08.456687 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7d835be4-a958-46b9-8319-4a13cb8ee018","Type":"ContainerStarted","Data":"e29d46567310258a937b32673b5e52bc020aade7a46aed78f2f185cd566ec626"} Mar 12 15:10:08 crc kubenswrapper[4869]: I0312 15:10:08.458117 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"101c26e1-fa16-444e-936e-82baa2aebf60","Type":"ContainerStarted","Data":"19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7"} Mar 12 15:10:08 crc kubenswrapper[4869]: I0312 15:10:08.479106 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.479089029 podStartE2EDuration="2.479089029s" podCreationTimestamp="2026-03-12 15:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:08.47875222 +0000 UTC m=+1360.763977498" watchObservedRunningTime="2026-03-12 15:10:08.479089029 +0000 UTC m=+1360.764314307" Mar 12 15:10:08 crc kubenswrapper[4869]: E0312 15:10:08.504505 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 15:10:08 crc kubenswrapper[4869]: E0312 15:10:08.505703 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 15:10:08 crc kubenswrapper[4869]: E0312 15:10:08.509891 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container 
is stopping, stdout: , stderr: , exit code -1" containerID="7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 15:10:08 crc kubenswrapper[4869]: E0312 15:10:08.509934 4869 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0ba9b14c-abed-454e-a2f3-809ccafcd42f" containerName="nova-scheduler-scheduler" Mar 12 15:10:09 crc kubenswrapper[4869]: I0312 15:10:09.485377 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d835be4-a958-46b9-8319-4a13cb8ee018","Type":"ContainerStarted","Data":"bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee"} Mar 12 15:10:09 crc kubenswrapper[4869]: I0312 15:10:09.486480 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d835be4-a958-46b9-8319-4a13cb8ee018","Type":"ContainerStarted","Data":"881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08"} Mar 12 15:10:09 crc kubenswrapper[4869]: I0312 15:10:09.517937 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.517901068 podStartE2EDuration="2.517901068s" podCreationTimestamp="2026-03-12 15:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:09.5098882 +0000 UTC m=+1361.795113488" watchObservedRunningTime="2026-03-12 15:10:09.517901068 +0000 UTC m=+1361.803126346" Mar 12 15:10:09 crc kubenswrapper[4869]: E0312 15:10:09.838912 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice/crio-23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice\": RecentStats: unable to find data in memory cache]" Mar 12 15:10:10 crc kubenswrapper[4869]: I0312 15:10:10.496336 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerStarted","Data":"295bc7fe2a9fcd50f072a180c87d50a67ec9e424d5e1414bf169e414aa58bfc3"} Mar 12 15:10:10 crc kubenswrapper[4869]: I0312 15:10:10.524521 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.635819402 podStartE2EDuration="11.52450151s" podCreationTimestamp="2026-03-12 15:09:59 +0000 UTC" firstStartedPulling="2026-03-12 15:10:00.57370387 +0000 UTC m=+1352.858929148" lastFinishedPulling="2026-03-12 15:10:09.462385978 +0000 UTC m=+1361.747611256" observedRunningTime="2026-03-12 15:10:10.510967425 +0000 UTC m=+1362.796192703" watchObservedRunningTime="2026-03-12 15:10:10.52450151 +0000 UTC m=+1362.809726788" Mar 12 15:10:11 crc kubenswrapper[4869]: I0312 15:10:11.504733 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.247117 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.359388 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data\") pod \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.359457 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjvs4\" (UniqueName: \"kubernetes.io/projected/0ba9b14c-abed-454e-a2f3-809ccafcd42f-kube-api-access-tjvs4\") pod \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.359500 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-combined-ca-bundle\") pod \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.365775 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba9b14c-abed-454e-a2f3-809ccafcd42f-kube-api-access-tjvs4" (OuterVolumeSpecName: "kube-api-access-tjvs4") pod "0ba9b14c-abed-454e-a2f3-809ccafcd42f" (UID: "0ba9b14c-abed-454e-a2f3-809ccafcd42f"). InnerVolumeSpecName "kube-api-access-tjvs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:12 crc kubenswrapper[4869]: E0312 15:10:12.423411 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data podName:0ba9b14c-abed-454e-a2f3-809ccafcd42f nodeName:}" failed. No retries permitted until 2026-03-12 15:10:12.923374257 +0000 UTC m=+1365.208599535 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data") pod "0ba9b14c-abed-454e-a2f3-809ccafcd42f" (UID: "0ba9b14c-abed-454e-a2f3-809ccafcd42f") : error deleting /var/lib/kubelet/pods/0ba9b14c-abed-454e-a2f3-809ccafcd42f/volume-subpaths: remove /var/lib/kubelet/pods/0ba9b14c-abed-454e-a2f3-809ccafcd42f/volume-subpaths: no such file or directory Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.427123 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ba9b14c-abed-454e-a2f3-809ccafcd42f" (UID: "0ba9b14c-abed-454e-a2f3-809ccafcd42f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.462782 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjvs4\" (UniqueName: \"kubernetes.io/projected/0ba9b14c-abed-454e-a2f3-809ccafcd42f-kube-api-access-tjvs4\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.463225 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.515279 4869 generic.go:334] "Generic (PLEG): container finished" podID="0ba9b14c-abed-454e-a2f3-809ccafcd42f" containerID="7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" exitCode=0 Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.515341 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.515359 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba9b14c-abed-454e-a2f3-809ccafcd42f","Type":"ContainerDied","Data":"7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238"} Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.515440 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ba9b14c-abed-454e-a2f3-809ccafcd42f","Type":"ContainerDied","Data":"081e35002fd9b43ba28f46cb613d8f60efde756e0633a17dee0f3984f300c389"} Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.515464 4869 scope.go:117] "RemoveContainer" containerID="7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.543278 4869 scope.go:117] "RemoveContainer" containerID="7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" Mar 12 15:10:12 crc kubenswrapper[4869]: E0312 15:10:12.544227 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238\": container with ID starting with 7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238 not found: ID does not exist" containerID="7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.544274 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238"} err="failed to get container status \"7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238\": rpc error: code = NotFound desc = could not find container \"7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238\": container with ID starting with 
7de72d0295662374b609e566a0f1f8fefdb3762c779d9751c4ae25f2c894a238 not found: ID does not exist" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.856699 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.856750 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.971593 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data\") pod \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\" (UID: \"0ba9b14c-abed-454e-a2f3-809ccafcd42f\") " Mar 12 15:10:12 crc kubenswrapper[4869]: I0312 15:10:12.975566 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data" (OuterVolumeSpecName: "config-data") pod "0ba9b14c-abed-454e-a2f3-809ccafcd42f" (UID: "0ba9b14c-abed-454e-a2f3-809ccafcd42f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.073860 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba9b14c-abed-454e-a2f3-809ccafcd42f-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.145483 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.155729 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.166804 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:13 crc kubenswrapper[4869]: E0312 15:10:13.167308 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9b14c-abed-454e-a2f3-809ccafcd42f" containerName="nova-scheduler-scheduler" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.167331 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9b14c-abed-454e-a2f3-809ccafcd42f" containerName="nova-scheduler-scheduler" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.167796 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9b14c-abed-454e-a2f3-809ccafcd42f" containerName="nova-scheduler-scheduler" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.168606 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.170705 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.175715 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxsmm\" (UniqueName: \"kubernetes.io/projected/d03a118b-3b5d-437b-b3e1-dcf419ec3065-kube-api-access-zxsmm\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.175941 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.176055 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-config-data\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.182526 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.277154 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.277207 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-config-data\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.277289 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxsmm\" (UniqueName: \"kubernetes.io/projected/d03a118b-3b5d-437b-b3e1-dcf419ec3065-kube-api-access-zxsmm\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.280820 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.287209 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-config-data\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.295040 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxsmm\" (UniqueName: \"kubernetes.io/projected/d03a118b-3b5d-437b-b3e1-dcf419ec3065-kube-api-access-zxsmm\") pod \"nova-scheduler-0\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.485768 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:13 crc kubenswrapper[4869]: W0312 15:10:13.937103 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd03a118b_3b5d_437b_b3e1_dcf419ec3065.slice/crio-05eade6b9531e29e74adfa75a0e186734fafdf432f243b282ed50939b5939e65 WatchSource:0}: Error finding container 05eade6b9531e29e74adfa75a0e186734fafdf432f243b282ed50939b5939e65: Status 404 returned error can't find the container with id 05eade6b9531e29e74adfa75a0e186734fafdf432f243b282ed50939b5939e65 Mar 12 15:10:13 crc kubenswrapper[4869]: I0312 15:10:13.937469 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:14 crc kubenswrapper[4869]: I0312 15:10:14.351662 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba9b14c-abed-454e-a2f3-809ccafcd42f" path="/var/lib/kubelet/pods/0ba9b14c-abed-454e-a2f3-809ccafcd42f/volumes" Mar 12 15:10:14 crc kubenswrapper[4869]: I0312 15:10:14.537094 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d03a118b-3b5d-437b-b3e1-dcf419ec3065","Type":"ContainerStarted","Data":"870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af"} Mar 12 15:10:14 crc kubenswrapper[4869]: I0312 15:10:14.537146 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d03a118b-3b5d-437b-b3e1-dcf419ec3065","Type":"ContainerStarted","Data":"05eade6b9531e29e74adfa75a0e186734fafdf432f243b282ed50939b5939e65"} Mar 12 15:10:14 crc kubenswrapper[4869]: I0312 15:10:14.552071 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.552053728 podStartE2EDuration="1.552053728s" podCreationTimestamp="2026-03-12 15:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 15:10:14.550214196 +0000 UTC m=+1366.835439474" watchObservedRunningTime="2026-03-12 15:10:14.552053728 +0000 UTC m=+1366.837279006" Mar 12 15:10:15 crc kubenswrapper[4869]: I0312 15:10:15.779630 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 12 15:10:16 crc kubenswrapper[4869]: I0312 15:10:16.865915 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 15:10:16 crc kubenswrapper[4869]: I0312 15:10:16.866527 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 15:10:17 crc kubenswrapper[4869]: I0312 15:10:17.576378 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a10d0ea8-db1d-4779-9b4c-d97edb20c85e","Type":"ContainerStarted","Data":"12c4bfec336d05cae541521a8b169c678a074f81ba23441c84731e123d2cf8a3"} Mar 12 15:10:17 crc kubenswrapper[4869]: I0312 15:10:17.599674 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.280684126 podStartE2EDuration="19.599653755s" podCreationTimestamp="2026-03-12 15:09:58 +0000 UTC" firstStartedPulling="2026-03-12 15:09:59.41229381 +0000 UTC m=+1351.697519088" lastFinishedPulling="2026-03-12 15:10:16.731263439 +0000 UTC m=+1369.016488717" observedRunningTime="2026-03-12 15:10:17.590198176 +0000 UTC m=+1369.875423464" watchObservedRunningTime="2026-03-12 15:10:17.599653755 +0000 UTC m=+1369.884879033" Mar 12 15:10:17 crc kubenswrapper[4869]: I0312 15:10:17.857110 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 15:10:17 crc kubenswrapper[4869]: I0312 15:10:17.858397 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 15:10:17 crc kubenswrapper[4869]: I0312 
15:10:17.949939 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:17 crc kubenswrapper[4869]: I0312 15:10:17.950041 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:18 crc kubenswrapper[4869]: I0312 15:10:18.486260 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 12 15:10:18 crc kubenswrapper[4869]: I0312 15:10:18.798096 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:10:18 crc kubenswrapper[4869]: I0312 15:10:18.798152 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:10:18 crc kubenswrapper[4869]: I0312 15:10:18.799747 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-cell1-novncproxy-0" podUID="a10d0ea8-db1d-4779-9b4c-d97edb20c85e" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.211:6080/vnc_lite.html\": dial tcp 10.217.0.211:6080: connect: connection refused" Mar 12 15:10:18 crc kubenswrapper[4869]: I0312 15:10:18.864036 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:18 crc kubenswrapper[4869]: 
I0312 15:10:18.868735 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:20 crc kubenswrapper[4869]: E0312 15:10:20.060947 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice/crio-23f0e9b3a7cf88948884e8f788df4eb82dcd7e1e08852ac82e535a3497644ef7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13a91ca_db8c_42b4_bfca_029e427aff28.slice\": RecentStats: unable to find data in memory cache]" Mar 12 15:10:23 crc kubenswrapper[4869]: I0312 15:10:23.487057 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 15:10:23 crc kubenswrapper[4869]: I0312 15:10:23.516170 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 15:10:23 crc kubenswrapper[4869]: I0312 15:10:23.659752 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 15:10:26 crc kubenswrapper[4869]: I0312 15:10:26.868883 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 15:10:26 crc kubenswrapper[4869]: I0312 15:10:26.869173 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 15:10:26 crc kubenswrapper[4869]: I0312 15:10:26.870084 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 15:10:26 crc kubenswrapper[4869]: I0312 15:10:26.870127 4869 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 15:10:26 crc kubenswrapper[4869]: I0312 15:10:26.872748 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 15:10:26 crc kubenswrapper[4869]: I0312 15:10:26.873807 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.066818 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-pnwr2"] Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.068389 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.112597 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-pnwr2"] Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.211980 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.212321 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-swift-storage-0\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.212354 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-config\") pod 
\"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.212402 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-sb\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.212683 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8flm5\" (UniqueName: \"kubernetes.io/projected/3301f303-5379-498d-978f-6606497ae3da-kube-api-access-8flm5\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.212723 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-svc\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.314707 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-svc\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.314804 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.314844 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-swift-storage-0\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.314873 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-config\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.314917 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-sb\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.314960 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8flm5\" (UniqueName: \"kubernetes.io/projected/3301f303-5379-498d-978f-6606497ae3da-kube-api-access-8flm5\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.315941 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: 
\"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.316012 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-swift-storage-0\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.316164 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-config\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.316627 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-sb\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.316751 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-svc\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.343397 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8flm5\" (UniqueName: \"kubernetes.io/projected/3301f303-5379-498d-978f-6606497ae3da-kube-api-access-8flm5\") pod \"dnsmasq-dns-5b4c997d87-pnwr2\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 
15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.412510 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.860097 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-pnwr2"] Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.880851 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.916454 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 15:10:27 crc kubenswrapper[4869]: I0312 15:10:27.917065 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 15:10:28 crc kubenswrapper[4869]: E0312 15:10:28.375674 4869 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/45934a6f9cc19b64db5199e6869d7ddab5e814185fa183128c62d38e5c22821b/diff" to get inode usage: stat /var/lib/containers/storage/overlay/45934a6f9cc19b64db5199e6869d7ddab5e814185fa183128c62d38e5c22821b/diff: no such file or directory, extraDiskErr: Mar 12 15:10:28 crc kubenswrapper[4869]: I0312 15:10:28.684992 4869 generic.go:334] "Generic (PLEG): container finished" podID="3301f303-5379-498d-978f-6606497ae3da" containerID="771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f" exitCode=0 Mar 12 15:10:28 crc kubenswrapper[4869]: I0312 15:10:28.685083 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" event={"ID":"3301f303-5379-498d-978f-6606497ae3da","Type":"ContainerDied","Data":"771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f"} Mar 12 15:10:28 crc kubenswrapper[4869]: I0312 15:10:28.685145 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" event={"ID":"3301f303-5379-498d-978f-6606497ae3da","Type":"ContainerStarted","Data":"32ce90efaf2bf231f0cba42e67b6a82ac1fec04cdc64fc9ebe2f2a58c0b11cf5"} Mar 12 15:10:28 crc kubenswrapper[4869]: I0312 15:10:28.704970 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 15:10:28 crc kubenswrapper[4869]: I0312 15:10:28.829567 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:10:28 crc kubenswrapper[4869]: I0312 15:10:28.865614 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.130628 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4zpf4"] Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.131901 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.134312 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.134410 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.141664 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4zpf4"] Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.155225 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-scripts\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc 
kubenswrapper[4869]: I0312 15:10:29.155267 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.155308 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-config-data\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.155416 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b76x\" (UniqueName: \"kubernetes.io/projected/b423daaf-54ec-4a18-a9b6-572c4f32a207-kube-api-access-5b76x\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.257254 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-config-data\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.257383 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b76x\" (UniqueName: \"kubernetes.io/projected/b423daaf-54ec-4a18-a9b6-572c4f32a207-kube-api-access-5b76x\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " 
pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.257498 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-scripts\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.257518 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.282160 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-config-data\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.283240 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.285352 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b76x\" (UniqueName: \"kubernetes.io/projected/b423daaf-54ec-4a18-a9b6-572c4f32a207-kube-api-access-5b76x\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc 
kubenswrapper[4869]: I0312 15:10:29.285719 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-scripts\") pod \"nova-cell1-cell-mapping-4zpf4\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.291495 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.291816 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="ceilometer-central-agent" containerID="cri-o://44d83d620e6d575294fd36749c92e155694e571fd0ffb96b94561349e9ad3ad5" gracePeriod=30 Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.292179 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="proxy-httpd" containerID="cri-o://295bc7fe2a9fcd50f072a180c87d50a67ec9e424d5e1414bf169e414aa58bfc3" gracePeriod=30 Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.292312 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="sg-core" containerID="cri-o://6ab533bcf44cdccef55fa48cbb2761294a453d5324f8d62c305a78c960df28fd" gracePeriod=30 Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.292369 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="ceilometer-notification-agent" containerID="cri-o://6efa829c29242f44e8e07501ff5369c59051fd3c4af89adc548677b9bd41977c" gracePeriod=30 Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.311044 4869 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ceilometer-0" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 12 15:10:29 crc kubenswrapper[4869]: I0312 15:10:29.457174 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.591051 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.664869 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.212:3000/\": dial tcp 10.217.0.212:3000: connect: connection refused" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.723837 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" event={"ID":"3301f303-5379-498d-978f-6606497ae3da","Type":"ContainerStarted","Data":"f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7"} Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.724112 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.728493 4869 generic.go:334] "Generic (PLEG): container finished" podID="09af255f-def2-476c-bf2e-77a49b59388d" containerID="295bc7fe2a9fcd50f072a180c87d50a67ec9e424d5e1414bf169e414aa58bfc3" exitCode=0 Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.728517 4869 generic.go:334] "Generic (PLEG): container finished" podID="09af255f-def2-476c-bf2e-77a49b59388d" containerID="6ab533bcf44cdccef55fa48cbb2761294a453d5324f8d62c305a78c960df28fd" exitCode=2 Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.728653 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerDied","Data":"295bc7fe2a9fcd50f072a180c87d50a67ec9e424d5e1414bf169e414aa58bfc3"} Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.730025 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerDied","Data":"6ab533bcf44cdccef55fa48cbb2761294a453d5324f8d62c305a78c960df28fd"} Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.730325 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-api" containerID="cri-o://19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7" gracePeriod=30 Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.730330 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-log" containerID="cri-o://50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a" gracePeriod=30 Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:29.756331 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" podStartSLOduration=2.7563147470000002 podStartE2EDuration="2.756314747s" podCreationTimestamp="2026-03-12 15:10:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:29.745024346 +0000 UTC m=+1382.030249624" watchObservedRunningTime="2026-03-12 15:10:29.756314747 +0000 UTC m=+1382.041540025" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.530709 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4zpf4"] Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.576996 4869 scope.go:117] "RemoveContainer" 
containerID="134a0a881c718f6804c48d1c69519c46cdcdfff2b928bbdb5070d0c06ca68204" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.755744 4869 generic.go:334] "Generic (PLEG): container finished" podID="09af255f-def2-476c-bf2e-77a49b59388d" containerID="6efa829c29242f44e8e07501ff5369c59051fd3c4af89adc548677b9bd41977c" exitCode=0 Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.755779 4869 generic.go:334] "Generic (PLEG): container finished" podID="09af255f-def2-476c-bf2e-77a49b59388d" containerID="44d83d620e6d575294fd36749c92e155694e571fd0ffb96b94561349e9ad3ad5" exitCode=0 Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.755840 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerDied","Data":"6efa829c29242f44e8e07501ff5369c59051fd3c4af89adc548677b9bd41977c"} Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.755884 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerDied","Data":"44d83d620e6d575294fd36749c92e155694e571fd0ffb96b94561349e9ad3ad5"} Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.770987 4869 generic.go:334] "Generic (PLEG): container finished" podID="101c26e1-fa16-444e-936e-82baa2aebf60" containerID="50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a" exitCode=143 Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.771071 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"101c26e1-fa16-444e-936e-82baa2aebf60","Type":"ContainerDied","Data":"50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a"} Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.772871 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4zpf4" 
event={"ID":"b423daaf-54ec-4a18-a9b6-572c4f32a207","Type":"ContainerStarted","Data":"d1f98d0540b5fe7a3c23ce0303c389c0076e134144f4512eee258cb2a4cd8b6d"} Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.820686 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996000 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-log-httpd\") pod \"09af255f-def2-476c-bf2e-77a49b59388d\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996337 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-run-httpd\") pod \"09af255f-def2-476c-bf2e-77a49b59388d\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996466 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-combined-ca-bundle\") pod \"09af255f-def2-476c-bf2e-77a49b59388d\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996470 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "09af255f-def2-476c-bf2e-77a49b59388d" (UID: "09af255f-def2-476c-bf2e-77a49b59388d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996502 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-scripts\") pod \"09af255f-def2-476c-bf2e-77a49b59388d\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996579 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-sg-core-conf-yaml\") pod \"09af255f-def2-476c-bf2e-77a49b59388d\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996612 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpnkq\" (UniqueName: \"kubernetes.io/projected/09af255f-def2-476c-bf2e-77a49b59388d-kube-api-access-jpnkq\") pod \"09af255f-def2-476c-bf2e-77a49b59388d\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996637 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "09af255f-def2-476c-bf2e-77a49b59388d" (UID: "09af255f-def2-476c-bf2e-77a49b59388d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.996666 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-config-data\") pod \"09af255f-def2-476c-bf2e-77a49b59388d\" (UID: \"09af255f-def2-476c-bf2e-77a49b59388d\") " Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.997375 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:30 crc kubenswrapper[4869]: I0312 15:10:30.997401 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09af255f-def2-476c-bf2e-77a49b59388d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.003775 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09af255f-def2-476c-bf2e-77a49b59388d-kube-api-access-jpnkq" (OuterVolumeSpecName: "kube-api-access-jpnkq") pod "09af255f-def2-476c-bf2e-77a49b59388d" (UID: "09af255f-def2-476c-bf2e-77a49b59388d"). InnerVolumeSpecName "kube-api-access-jpnkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.005927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-scripts" (OuterVolumeSpecName: "scripts") pod "09af255f-def2-476c-bf2e-77a49b59388d" (UID: "09af255f-def2-476c-bf2e-77a49b59388d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.029223 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "09af255f-def2-476c-bf2e-77a49b59388d" (UID: "09af255f-def2-476c-bf2e-77a49b59388d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.078721 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09af255f-def2-476c-bf2e-77a49b59388d" (UID: "09af255f-def2-476c-bf2e-77a49b59388d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.099062 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.099092 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpnkq\" (UniqueName: \"kubernetes.io/projected/09af255f-def2-476c-bf2e-77a49b59388d-kube-api-access-jpnkq\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.099102 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.099110 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.123076 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-config-data" (OuterVolumeSpecName: "config-data") pod "09af255f-def2-476c-bf2e-77a49b59388d" (UID: "09af255f-def2-476c-bf2e-77a49b59388d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.201258 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09af255f-def2-476c-bf2e-77a49b59388d-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.782231 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4zpf4" event={"ID":"b423daaf-54ec-4a18-a9b6-572c4f32a207","Type":"ContainerStarted","Data":"4daa8807894435a3e07a29aced07bd516c2036a676927d13c83627681b7ae737"} Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.786155 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"09af255f-def2-476c-bf2e-77a49b59388d","Type":"ContainerDied","Data":"307f4e18466d3ee16e17bfd1f206d14eb3d9e722823d7791de5a1821ef9c7adb"} Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.786325 4869 scope.go:117] "RemoveContainer" containerID="295bc7fe2a9fcd50f072a180c87d50a67ec9e424d5e1414bf169e414aa58bfc3" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.786563 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.827775 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4zpf4" podStartSLOduration=2.827756679 podStartE2EDuration="2.827756679s" podCreationTimestamp="2026-03-12 15:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:31.818224068 +0000 UTC m=+1384.103449346" watchObservedRunningTime="2026-03-12 15:10:31.827756679 +0000 UTC m=+1384.112981957" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.848597 4869 scope.go:117] "RemoveContainer" containerID="6ab533bcf44cdccef55fa48cbb2761294a453d5324f8d62c305a78c960df28fd" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.852914 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.861366 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.904416 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:10:31 crc kubenswrapper[4869]: E0312 15:10:31.904883 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="sg-core" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.904902 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="sg-core" Mar 12 15:10:31 crc kubenswrapper[4869]: E0312 15:10:31.904924 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="ceilometer-central-agent" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.904930 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="09af255f-def2-476c-bf2e-77a49b59388d" 
containerName="ceilometer-central-agent" Mar 12 15:10:31 crc kubenswrapper[4869]: E0312 15:10:31.904940 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="ceilometer-notification-agent" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.904946 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="ceilometer-notification-agent" Mar 12 15:10:31 crc kubenswrapper[4869]: E0312 15:10:31.904961 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="proxy-httpd" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.904967 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="proxy-httpd" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.905144 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="ceilometer-notification-agent" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.905166 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="ceilometer-central-agent" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.905177 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="proxy-httpd" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.905185 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="09af255f-def2-476c-bf2e-77a49b59388d" containerName="sg-core" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.906820 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.911259 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.911434 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.920470 4869 scope.go:117] "RemoveContainer" containerID="6efa829c29242f44e8e07501ff5369c59051fd3c4af89adc548677b9bd41977c" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.922812 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-log-httpd\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.922857 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.922943 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdqvm\" (UniqueName: \"kubernetes.io/projected/b75c9a12-7533-4855-b543-e75d0bb77857-kube-api-access-rdqvm\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.922988 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.923028 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.923064 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-config-data\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.923102 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-scripts\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.923128 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:10:31 crc kubenswrapper[4869]: I0312 15:10:31.977021 4869 scope.go:117] "RemoveContainer" containerID="44d83d620e6d575294fd36749c92e155694e571fd0ffb96b94561349e9ad3ad5" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.025324 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-scripts\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.025417 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-log-httpd\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.025444 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.025517 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdqvm\" (UniqueName: \"kubernetes.io/projected/b75c9a12-7533-4855-b543-e75d0bb77857-kube-api-access-rdqvm\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.025592 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-run-httpd\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.025624 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.025666 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-config-data\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 
15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.027320 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-run-httpd\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.027500 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-log-httpd\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.030355 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-scripts\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.035112 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.037405 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-config-data\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.044197 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.076946 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdqvm\" (UniqueName: \"kubernetes.io/projected/b75c9a12-7533-4855-b543-e75d0bb77857-kube-api-access-rdqvm\") pod \"ceilometer-0\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.233013 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.371356 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09af255f-def2-476c-bf2e-77a49b59388d" path="/var/lib/kubelet/pods/09af255f-def2-476c-bf2e-77a49b59388d/volumes" Mar 12 15:10:32 crc kubenswrapper[4869]: W0312 15:10:32.737685 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb75c9a12_7533_4855_b543_e75d0bb77857.slice/crio-fd0013b5e060bf89207eae199fb4346d009e593bdfaea97f471765e3143384d0 WatchSource:0}: Error finding container fd0013b5e060bf89207eae199fb4346d009e593bdfaea97f471765e3143384d0: Status 404 returned error can't find the container with id fd0013b5e060bf89207eae199fb4346d009e593bdfaea97f471765e3143384d0 Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.738651 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:10:32 crc kubenswrapper[4869]: I0312 15:10:32.802636 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerStarted","Data":"fd0013b5e060bf89207eae199fb4346d009e593bdfaea97f471765e3143384d0"} Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.287982 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.362769 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7kcg\" (UniqueName: \"kubernetes.io/projected/101c26e1-fa16-444e-936e-82baa2aebf60-kube-api-access-z7kcg\") pod \"101c26e1-fa16-444e-936e-82baa2aebf60\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.362896 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-config-data\") pod \"101c26e1-fa16-444e-936e-82baa2aebf60\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.362932 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/101c26e1-fa16-444e-936e-82baa2aebf60-logs\") pod \"101c26e1-fa16-444e-936e-82baa2aebf60\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.363027 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-combined-ca-bundle\") pod \"101c26e1-fa16-444e-936e-82baa2aebf60\" (UID: \"101c26e1-fa16-444e-936e-82baa2aebf60\") " Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.366262 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/101c26e1-fa16-444e-936e-82baa2aebf60-logs" (OuterVolumeSpecName: "logs") pod "101c26e1-fa16-444e-936e-82baa2aebf60" (UID: "101c26e1-fa16-444e-936e-82baa2aebf60"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.372735 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/101c26e1-fa16-444e-936e-82baa2aebf60-kube-api-access-z7kcg" (OuterVolumeSpecName: "kube-api-access-z7kcg") pod "101c26e1-fa16-444e-936e-82baa2aebf60" (UID: "101c26e1-fa16-444e-936e-82baa2aebf60"). InnerVolumeSpecName "kube-api-access-z7kcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.412497 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-config-data" (OuterVolumeSpecName: "config-data") pod "101c26e1-fa16-444e-936e-82baa2aebf60" (UID: "101c26e1-fa16-444e-936e-82baa2aebf60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.430982 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "101c26e1-fa16-444e-936e-82baa2aebf60" (UID: "101c26e1-fa16-444e-936e-82baa2aebf60"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.465677 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7kcg\" (UniqueName: \"kubernetes.io/projected/101c26e1-fa16-444e-936e-82baa2aebf60-kube-api-access-z7kcg\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.465714 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.465726 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/101c26e1-fa16-444e-936e-82baa2aebf60-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.465740 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c26e1-fa16-444e-936e-82baa2aebf60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.818575 4869 generic.go:334] "Generic (PLEG): container finished" podID="101c26e1-fa16-444e-936e-82baa2aebf60" containerID="19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7" exitCode=0 Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.818652 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"101c26e1-fa16-444e-936e-82baa2aebf60","Type":"ContainerDied","Data":"19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7"} Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.818689 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"101c26e1-fa16-444e-936e-82baa2aebf60","Type":"ContainerDied","Data":"b25b47c90ee26af12853d342a01e8f708695c4b3b0b4cb8f64bdc3b56ff47f63"} Mar 12 15:10:33 crc kubenswrapper[4869]: 
I0312 15:10:33.818711 4869 scope.go:117] "RemoveContainer" containerID="19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.818859 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.829556 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerStarted","Data":"5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4"} Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.866100 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.868920 4869 scope.go:117] "RemoveContainer" containerID="50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.874330 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.886122 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:33 crc kubenswrapper[4869]: E0312 15:10:33.886595 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-api" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.886617 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-api" Mar 12 15:10:33 crc kubenswrapper[4869]: E0312 15:10:33.886645 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-log" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.886653 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" 
containerName="nova-api-log" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.886848 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-api" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.886869 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" containerName="nova-api-log" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.887835 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.917388 4869 scope.go:117] "RemoveContainer" containerID="19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7" Mar 12 15:10:33 crc kubenswrapper[4869]: E0312 15:10:33.917933 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7\": container with ID starting with 19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7 not found: ID does not exist" containerID="19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.917970 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7"} err="failed to get container status \"19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7\": rpc error: code = NotFound desc = could not find container \"19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7\": container with ID starting with 19bf0ec7cf726e8753cb3f746e7bd71378da0722ae771fd64a40e818db937dd7 not found: ID does not exist" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.917993 4869 scope.go:117] "RemoveContainer" 
containerID="50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a" Mar 12 15:10:33 crc kubenswrapper[4869]: E0312 15:10:33.918327 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a\": container with ID starting with 50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a not found: ID does not exist" containerID="50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.918360 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a"} err="failed to get container status \"50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a\": rpc error: code = NotFound desc = could not find container \"50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a\": container with ID starting with 50f9dc841391d688b3de13a15773bd6b23c8d1163f074c9d462fde9d66258e3a not found: ID does not exist" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.919536 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.920490 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.921312 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.933251 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.974510 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.974572 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74wz\" (UniqueName: \"kubernetes.io/projected/d6802a93-e4d4-4e66-af1d-417e105b51a7-kube-api-access-l74wz\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.974863 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6802a93-e4d4-4e66-af1d-417e105b51a7-logs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.975063 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-config-data\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.975215 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:33 crc kubenswrapper[4869]: I0312 15:10:33.975269 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.077858 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-config-data\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.077959 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.078038 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.078101 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.078130 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l74wz\" (UniqueName: \"kubernetes.io/projected/d6802a93-e4d4-4e66-af1d-417e105b51a7-kube-api-access-l74wz\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.078235 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6802a93-e4d4-4e66-af1d-417e105b51a7-logs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.078698 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6802a93-e4d4-4e66-af1d-417e105b51a7-logs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.082735 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-public-tls-certs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.083155 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.084245 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.092571 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-config-data\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.100184 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l74wz\" (UniqueName: \"kubernetes.io/projected/d6802a93-e4d4-4e66-af1d-417e105b51a7-kube-api-access-l74wz\") pod \"nova-api-0\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.243079 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.357093 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="101c26e1-fa16-444e-936e-82baa2aebf60" path="/var/lib/kubelet/pods/101c26e1-fa16-444e-936e-82baa2aebf60/volumes" Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.709986 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:34 crc kubenswrapper[4869]: W0312 15:10:34.714148 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6802a93_e4d4_4e66_af1d_417e105b51a7.slice/crio-ee23f40bd627b4878bc3fafb3e2df7b667c0cf490400a3807cf596527506f89c WatchSource:0}: Error finding container ee23f40bd627b4878bc3fafb3e2df7b667c0cf490400a3807cf596527506f89c: Status 404 returned error can't find the container with id ee23f40bd627b4878bc3fafb3e2df7b667c0cf490400a3807cf596527506f89c Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.841479 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6802a93-e4d4-4e66-af1d-417e105b51a7","Type":"ContainerStarted","Data":"ee23f40bd627b4878bc3fafb3e2df7b667c0cf490400a3807cf596527506f89c"} Mar 12 15:10:34 crc kubenswrapper[4869]: I0312 15:10:34.843613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerStarted","Data":"4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae"} Mar 12 15:10:35 crc 
kubenswrapper[4869]: I0312 15:10:35.853922 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6802a93-e4d4-4e66-af1d-417e105b51a7","Type":"ContainerStarted","Data":"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413"} Mar 12 15:10:35 crc kubenswrapper[4869]: I0312 15:10:35.854227 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6802a93-e4d4-4e66-af1d-417e105b51a7","Type":"ContainerStarted","Data":"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943"} Mar 12 15:10:35 crc kubenswrapper[4869]: I0312 15:10:35.857641 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerStarted","Data":"64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b"} Mar 12 15:10:35 crc kubenswrapper[4869]: I0312 15:10:35.889729 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.889706867 podStartE2EDuration="2.889706867s" podCreationTimestamp="2026-03-12 15:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:35.880728411 +0000 UTC m=+1388.165953689" watchObservedRunningTime="2026-03-12 15:10:35.889706867 +0000 UTC m=+1388.174932145" Mar 12 15:10:36 crc kubenswrapper[4869]: I0312 15:10:36.867920 4869 generic.go:334] "Generic (PLEG): container finished" podID="b423daaf-54ec-4a18-a9b6-572c4f32a207" containerID="4daa8807894435a3e07a29aced07bd516c2036a676927d13c83627681b7ae737" exitCode=0 Mar 12 15:10:36 crc kubenswrapper[4869]: I0312 15:10:36.868185 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4zpf4" event={"ID":"b423daaf-54ec-4a18-a9b6-572c4f32a207","Type":"ContainerDied","Data":"4daa8807894435a3e07a29aced07bd516c2036a676927d13c83627681b7ae737"} Mar 
12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.416008 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.490371 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-jzxfs"] Mar 12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.490620 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" podUID="079bc323-9373-49b7-ba69-d64155e9e902" containerName="dnsmasq-dns" containerID="cri-o://e3a07bd69d8e49c784071c6adf360c3eb344bcfd4b9a799e7db5cdfbbfaafbec" gracePeriod=10 Mar 12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.916300 4869 generic.go:334] "Generic (PLEG): container finished" podID="079bc323-9373-49b7-ba69-d64155e9e902" containerID="e3a07bd69d8e49c784071c6adf360c3eb344bcfd4b9a799e7db5cdfbbfaafbec" exitCode=0 Mar 12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.916651 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" event={"ID":"079bc323-9373-49b7-ba69-d64155e9e902","Type":"ContainerDied","Data":"e3a07bd69d8e49c784071c6adf360c3eb344bcfd4b9a799e7db5cdfbbfaafbec"} Mar 12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.921009 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerStarted","Data":"c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7"} Mar 12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.921198 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 15:10:37 crc kubenswrapper[4869]: I0312 15:10:37.953726 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.642794518 podStartE2EDuration="6.953700666s" podCreationTimestamp="2026-03-12 
15:10:31 +0000 UTC" firstStartedPulling="2026-03-12 15:10:32.740817816 +0000 UTC m=+1385.026043094" lastFinishedPulling="2026-03-12 15:10:37.051723964 +0000 UTC m=+1389.336949242" observedRunningTime="2026-03-12 15:10:37.950921437 +0000 UTC m=+1390.236146715" watchObservedRunningTime="2026-03-12 15:10:37.953700666 +0000 UTC m=+1390.238925944" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.041944 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.053827 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-nb\") pod \"079bc323-9373-49b7-ba69-d64155e9e902\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.053882 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb2h4\" (UniqueName: \"kubernetes.io/projected/079bc323-9373-49b7-ba69-d64155e9e902-kube-api-access-mb2h4\") pod \"079bc323-9373-49b7-ba69-d64155e9e902\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.053907 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-svc\") pod \"079bc323-9373-49b7-ba69-d64155e9e902\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.053929 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-swift-storage-0\") pod \"079bc323-9373-49b7-ba69-d64155e9e902\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " Mar 12 15:10:38 crc 
kubenswrapper[4869]: I0312 15:10:38.053968 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-config\") pod \"079bc323-9373-49b7-ba69-d64155e9e902\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.054096 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-sb\") pod \"079bc323-9373-49b7-ba69-d64155e9e902\" (UID: \"079bc323-9373-49b7-ba69-d64155e9e902\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.070733 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079bc323-9373-49b7-ba69-d64155e9e902-kube-api-access-mb2h4" (OuterVolumeSpecName: "kube-api-access-mb2h4") pod "079bc323-9373-49b7-ba69-d64155e9e902" (UID: "079bc323-9373-49b7-ba69-d64155e9e902"). InnerVolumeSpecName "kube-api-access-mb2h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.126249 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-config" (OuterVolumeSpecName: "config") pod "079bc323-9373-49b7-ba69-d64155e9e902" (UID: "079bc323-9373-49b7-ba69-d64155e9e902"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.151219 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "079bc323-9373-49b7-ba69-d64155e9e902" (UID: "079bc323-9373-49b7-ba69-d64155e9e902"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.159113 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.159223 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.159298 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb2h4\" (UniqueName: \"kubernetes.io/projected/079bc323-9373-49b7-ba69-d64155e9e902-kube-api-access-mb2h4\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.174113 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "079bc323-9373-49b7-ba69-d64155e9e902" (UID: "079bc323-9373-49b7-ba69-d64155e9e902"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.181792 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "079bc323-9373-49b7-ba69-d64155e9e902" (UID: "079bc323-9373-49b7-ba69-d64155e9e902"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.247470 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "079bc323-9373-49b7-ba69-d64155e9e902" (UID: "079bc323-9373-49b7-ba69-d64155e9e902"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.263789 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.263824 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.263835 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/079bc323-9373-49b7-ba69-d64155e9e902-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.333736 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sz5k6"] Mar 12 15:10:38 crc kubenswrapper[4869]: E0312 15:10:38.334341 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079bc323-9373-49b7-ba69-d64155e9e902" containerName="dnsmasq-dns" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.334365 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="079bc323-9373-49b7-ba69-d64155e9e902" containerName="dnsmasq-dns" Mar 12 15:10:38 crc kubenswrapper[4869]: E0312 15:10:38.334421 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="079bc323-9373-49b7-ba69-d64155e9e902" containerName="init" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.334431 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="079bc323-9373-49b7-ba69-d64155e9e902" containerName="init" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.334692 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="079bc323-9373-49b7-ba69-d64155e9e902" containerName="dnsmasq-dns" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.336573 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.355626 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.397012 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sz5k6"] Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.469128 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-config-data\") pod \"b423daaf-54ec-4a18-a9b6-572c4f32a207\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.469256 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-combined-ca-bundle\") pod \"b423daaf-54ec-4a18-a9b6-572c4f32a207\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.469339 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-scripts\") pod \"b423daaf-54ec-4a18-a9b6-572c4f32a207\" 
(UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.469373 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b76x\" (UniqueName: \"kubernetes.io/projected/b423daaf-54ec-4a18-a9b6-572c4f32a207-kube-api-access-5b76x\") pod \"b423daaf-54ec-4a18-a9b6-572c4f32a207\" (UID: \"b423daaf-54ec-4a18-a9b6-572c4f32a207\") " Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.469812 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-catalog-content\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.469912 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-utilities\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.469969 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjjkk\" (UniqueName: \"kubernetes.io/projected/842d495b-32b4-4235-8cce-c4a1af711991-kube-api-access-gjjkk\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.478293 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-scripts" (OuterVolumeSpecName: "scripts") pod "b423daaf-54ec-4a18-a9b6-572c4f32a207" (UID: "b423daaf-54ec-4a18-a9b6-572c4f32a207"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.488147 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b423daaf-54ec-4a18-a9b6-572c4f32a207-kube-api-access-5b76x" (OuterVolumeSpecName: "kube-api-access-5b76x") pod "b423daaf-54ec-4a18-a9b6-572c4f32a207" (UID: "b423daaf-54ec-4a18-a9b6-572c4f32a207"). InnerVolumeSpecName "kube-api-access-5b76x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.514672 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b423daaf-54ec-4a18-a9b6-572c4f32a207" (UID: "b423daaf-54ec-4a18-a9b6-572c4f32a207"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.520568 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-config-data" (OuterVolumeSpecName: "config-data") pod "b423daaf-54ec-4a18-a9b6-572c4f32a207" (UID: "b423daaf-54ec-4a18-a9b6-572c4f32a207"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.572871 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-catalog-content\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.573415 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-catalog-content\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.573428 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-utilities\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.573645 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjjkk\" (UniqueName: \"kubernetes.io/projected/842d495b-32b4-4235-8cce-c4a1af711991-kube-api-access-gjjkk\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.573963 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-utilities\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " 
pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.574119 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.574143 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.574157 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b423daaf-54ec-4a18-a9b6-572c4f32a207-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.574168 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b76x\" (UniqueName: \"kubernetes.io/projected/b423daaf-54ec-4a18-a9b6-572c4f32a207-kube-api-access-5b76x\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.590850 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjjkk\" (UniqueName: \"kubernetes.io/projected/842d495b-32b4-4235-8cce-c4a1af711991-kube-api-access-gjjkk\") pod \"certified-operators-sz5k6\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.681167 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.945458 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4zpf4" event={"ID":"b423daaf-54ec-4a18-a9b6-572c4f32a207","Type":"ContainerDied","Data":"d1f98d0540b5fe7a3c23ce0303c389c0076e134144f4512eee258cb2a4cd8b6d"} Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.945527 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1f98d0540b5fe7a3c23ce0303c389c0076e134144f4512eee258cb2a4cd8b6d" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.945554 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4zpf4" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.950388 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.950429 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-jzxfs" event={"ID":"079bc323-9373-49b7-ba69-d64155e9e902","Type":"ContainerDied","Data":"276ee0c126a776e50b6e4ac34ec6efc0ade7e8454dd26d1120c063e088c05948"} Mar 12 15:10:38 crc kubenswrapper[4869]: I0312 15:10:38.950464 4869 scope.go:117] "RemoveContainer" containerID="e3a07bd69d8e49c784071c6adf360c3eb344bcfd4b9a799e7db5cdfbbfaafbec" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.005941 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-jzxfs"] Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.010935 4869 scope.go:117] "RemoveContainer" containerID="f31bb762f40d5512ae7613e29ecc5b03ad21a7c2e2bf74be3487911076996654" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.025152 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-jzxfs"] Mar 12 
15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.081775 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.082012 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerName="nova-api-log" containerID="cri-o://02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943" gracePeriod=30 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.082131 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerName="nova-api-api" containerID="cri-o://a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413" gracePeriod=30 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.137668 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.137965 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d03a118b-3b5d-437b-b3e1-dcf419ec3065" containerName="nova-scheduler-scheduler" containerID="cri-o://870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af" gracePeriod=30 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.164138 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.164358 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-log" containerID="cri-o://881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08" gracePeriod=30 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.165782 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-metadata" containerID="cri-o://bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee" gracePeriod=30 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.219248 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sz5k6"] Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.818507 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.909052 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-combined-ca-bundle\") pod \"d6802a93-e4d4-4e66-af1d-417e105b51a7\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.909369 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6802a93-e4d4-4e66-af1d-417e105b51a7-logs\") pod \"d6802a93-e4d4-4e66-af1d-417e105b51a7\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.909471 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-config-data\") pod \"d6802a93-e4d4-4e66-af1d-417e105b51a7\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.909588 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-internal-tls-certs\") pod \"d6802a93-e4d4-4e66-af1d-417e105b51a7\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " Mar 12 15:10:39 crc 
kubenswrapper[4869]: I0312 15:10:39.909728 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-public-tls-certs\") pod \"d6802a93-e4d4-4e66-af1d-417e105b51a7\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.909899 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l74wz\" (UniqueName: \"kubernetes.io/projected/d6802a93-e4d4-4e66-af1d-417e105b51a7-kube-api-access-l74wz\") pod \"d6802a93-e4d4-4e66-af1d-417e105b51a7\" (UID: \"d6802a93-e4d4-4e66-af1d-417e105b51a7\") " Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.912794 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6802a93-e4d4-4e66-af1d-417e105b51a7-logs" (OuterVolumeSpecName: "logs") pod "d6802a93-e4d4-4e66-af1d-417e105b51a7" (UID: "d6802a93-e4d4-4e66-af1d-417e105b51a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.924755 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6802a93-e4d4-4e66-af1d-417e105b51a7-kube-api-access-l74wz" (OuterVolumeSpecName: "kube-api-access-l74wz") pod "d6802a93-e4d4-4e66-af1d-417e105b51a7" (UID: "d6802a93-e4d4-4e66-af1d-417e105b51a7"). InnerVolumeSpecName "kube-api-access-l74wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.940283 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6802a93-e4d4-4e66-af1d-417e105b51a7" (UID: "d6802a93-e4d4-4e66-af1d-417e105b51a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.950207 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-config-data" (OuterVolumeSpecName: "config-data") pod "d6802a93-e4d4-4e66-af1d-417e105b51a7" (UID: "d6802a93-e4d4-4e66-af1d-417e105b51a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.963406 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerID="881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08" exitCode=143 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.963481 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d835be4-a958-46b9-8319-4a13cb8ee018","Type":"ContainerDied","Data":"881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08"} Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.966165 4869 generic.go:334] "Generic (PLEG): container finished" podID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerID="a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413" exitCode=0 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.966205 4869 generic.go:334] "Generic (PLEG): container finished" podID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerID="02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943" exitCode=143 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.966236 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.966241 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6802a93-e4d4-4e66-af1d-417e105b51a7","Type":"ContainerDied","Data":"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413"} Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.966277 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6802a93-e4d4-4e66-af1d-417e105b51a7","Type":"ContainerDied","Data":"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943"} Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.966313 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d6802a93-e4d4-4e66-af1d-417e105b51a7","Type":"ContainerDied","Data":"ee23f40bd627b4878bc3fafb3e2df7b667c0cf490400a3807cf596527506f89c"} Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.966335 4869 scope.go:117] "RemoveContainer" containerID="a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.970756 4869 generic.go:334] "Generic (PLEG): container finished" podID="842d495b-32b4-4235-8cce-c4a1af711991" containerID="cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc" exitCode=0 Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.970801 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sz5k6" event={"ID":"842d495b-32b4-4235-8cce-c4a1af711991","Type":"ContainerDied","Data":"cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc"} Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.970823 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sz5k6" 
event={"ID":"842d495b-32b4-4235-8cce-c4a1af711991","Type":"ContainerStarted","Data":"073b793021b23a6a4322c1aaae80f0c602462bbf36293a6875ac0375ee293a77"} Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.975742 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d6802a93-e4d4-4e66-af1d-417e105b51a7" (UID: "d6802a93-e4d4-4e66-af1d-417e105b51a7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.976557 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d6802a93-e4d4-4e66-af1d-417e105b51a7" (UID: "d6802a93-e4d4-4e66-af1d-417e105b51a7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:39 crc kubenswrapper[4869]: I0312 15:10:39.999749 4869 scope.go:117] "RemoveContainer" containerID="02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.012473 4869 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.013108 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l74wz\" (UniqueName: \"kubernetes.io/projected/d6802a93-e4d4-4e66-af1d-417e105b51a7-kube-api-access-l74wz\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.013173 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.013324 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6802a93-e4d4-4e66-af1d-417e105b51a7-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.013378 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.013432 4869 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6802a93-e4d4-4e66-af1d-417e105b51a7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.026862 4869 scope.go:117] "RemoveContainer" containerID="a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413" Mar 12 15:10:40 crc kubenswrapper[4869]: E0312 15:10:40.027568 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413\": container with ID starting with a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413 not found: ID does not exist" containerID="a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.027602 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413"} err="failed to get container status \"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413\": rpc error: code = NotFound desc = could not find container 
\"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413\": container with ID starting with a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413 not found: ID does not exist" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.027628 4869 scope.go:117] "RemoveContainer" containerID="02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943" Mar 12 15:10:40 crc kubenswrapper[4869]: E0312 15:10:40.027971 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943\": container with ID starting with 02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943 not found: ID does not exist" containerID="02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.028038 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943"} err="failed to get container status \"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943\": rpc error: code = NotFound desc = could not find container \"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943\": container with ID starting with 02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943 not found: ID does not exist" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.028071 4869 scope.go:117] "RemoveContainer" containerID="a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.029965 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413"} err="failed to get container status \"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413\": rpc error: code = NotFound desc = could not find 
container \"a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413\": container with ID starting with a9612bb1f3dd9a66aca41cceb7cfc88b3472cc70bb76fef34fdff2a9ee21b413 not found: ID does not exist" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.029994 4869 scope.go:117] "RemoveContainer" containerID="02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.030329 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943"} err="failed to get container status \"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943\": rpc error: code = NotFound desc = could not find container \"02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943\": container with ID starting with 02653386e90ac849973df55ff06718f3a8c7354b38d0d8c56a678389d7dbb943 not found: ID does not exist" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.296938 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.305747 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.323385 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:40 crc kubenswrapper[4869]: E0312 15:10:40.323830 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b423daaf-54ec-4a18-a9b6-572c4f32a207" containerName="nova-manage" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.323851 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b423daaf-54ec-4a18-a9b6-572c4f32a207" containerName="nova-manage" Mar 12 15:10:40 crc kubenswrapper[4869]: E0312 15:10:40.323864 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" 
containerName="nova-api-api" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.323870 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerName="nova-api-api" Mar 12 15:10:40 crc kubenswrapper[4869]: E0312 15:10:40.323880 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerName="nova-api-log" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.323886 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerName="nova-api-log" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.325955 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerName="nova-api-log" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.326000 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b423daaf-54ec-4a18-a9b6-572c4f32a207" containerName="nova-manage" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.326013 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" containerName="nova-api-api" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.327423 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.331725 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.331811 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.331744 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.353217 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079bc323-9373-49b7-ba69-d64155e9e902" path="/var/lib/kubelet/pods/079bc323-9373-49b7-ba69-d64155e9e902/volumes" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.354103 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6802a93-e4d4-4e66-af1d-417e105b51a7" path="/var/lib/kubelet/pods/d6802a93-e4d4-4e66-af1d-417e105b51a7/volumes" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.354821 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.422882 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9n9\" (UniqueName: \"kubernetes.io/projected/ded10d99-1db4-470e-bc5f-356cf78424c4-kube-api-access-jd9n9\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.422989 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: 
I0312 15:10:40.423037 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-config-data\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.423099 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded10d99-1db4-470e-bc5f-356cf78424c4-logs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.423195 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.423258 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.528927 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded10d99-1db4-470e-bc5f-356cf78424c4-logs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.529037 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.529108 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.529200 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9n9\" (UniqueName: \"kubernetes.io/projected/ded10d99-1db4-470e-bc5f-356cf78424c4-kube-api-access-jd9n9\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.529244 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-config-data\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.529263 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.529396 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ded10d99-1db4-470e-bc5f-356cf78424c4-logs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.536151 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.536716 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-config-data\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.541073 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.543897 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded10d99-1db4-470e-bc5f-356cf78424c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.548863 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9n9\" (UniqueName: \"kubernetes.io/projected/ded10d99-1db4-470e-bc5f-356cf78424c4-kube-api-access-jd9n9\") pod \"nova-api-0\" (UID: \"ded10d99-1db4-470e-bc5f-356cf78424c4\") " pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: E0312 15:10:40.630305 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd03a118b_3b5d_437b_b3e1_dcf419ec3065.slice/crio-conmon-870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd03a118b_3b5d_437b_b3e1_dcf419ec3065.slice/crio-870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af.scope\": RecentStats: unable to find data in memory cache]" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.649643 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.875166 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.990588 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sz5k6" event={"ID":"842d495b-32b4-4235-8cce-c4a1af711991","Type":"ContainerStarted","Data":"dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51"} Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.995615 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.995702 4869 generic.go:334] "Generic (PLEG): container finished" podID="d03a118b-3b5d-437b-b3e1-dcf419ec3065" containerID="870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af" exitCode=0 Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.995887 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d03a118b-3b5d-437b-b3e1-dcf419ec3065","Type":"ContainerDied","Data":"870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af"} Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.995954 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d03a118b-3b5d-437b-b3e1-dcf419ec3065","Type":"ContainerDied","Data":"05eade6b9531e29e74adfa75a0e186734fafdf432f243b282ed50939b5939e65"} Mar 12 15:10:40 crc kubenswrapper[4869]: I0312 15:10:40.995975 4869 scope.go:117] "RemoveContainer" containerID="870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.027152 4869 scope.go:117] "RemoveContainer" containerID="870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af" Mar 12 15:10:41 crc kubenswrapper[4869]: E0312 15:10:41.027845 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af\": container with ID starting with 870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af not found: ID does not exist" containerID="870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.027914 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af"} err="failed to get container 
status \"870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af\": rpc error: code = NotFound desc = could not find container \"870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af\": container with ID starting with 870ebf208cd0fac909d931e373003ad829d820f144149533ee90b199d90207af not found: ID does not exist" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.041583 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-combined-ca-bundle\") pod \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.041671 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-config-data\") pod \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.041950 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxsmm\" (UniqueName: \"kubernetes.io/projected/d03a118b-3b5d-437b-b3e1-dcf419ec3065-kube-api-access-zxsmm\") pod \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\" (UID: \"d03a118b-3b5d-437b-b3e1-dcf419ec3065\") " Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.054707 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03a118b-3b5d-437b-b3e1-dcf419ec3065-kube-api-access-zxsmm" (OuterVolumeSpecName: "kube-api-access-zxsmm") pod "d03a118b-3b5d-437b-b3e1-dcf419ec3065" (UID: "d03a118b-3b5d-437b-b3e1-dcf419ec3065"). InnerVolumeSpecName "kube-api-access-zxsmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.079930 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d03a118b-3b5d-437b-b3e1-dcf419ec3065" (UID: "d03a118b-3b5d-437b-b3e1-dcf419ec3065"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.079946 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-config-data" (OuterVolumeSpecName: "config-data") pod "d03a118b-3b5d-437b-b3e1-dcf419ec3065" (UID: "d03a118b-3b5d-437b-b3e1-dcf419ec3065"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.144606 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.144849 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03a118b-3b5d-437b-b3e1-dcf419ec3065-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.144858 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxsmm\" (UniqueName: \"kubernetes.io/projected/d03a118b-3b5d-437b-b3e1-dcf419ec3065-kube-api-access-zxsmm\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.180925 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.361713 4869 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.381736 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.391425 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:41 crc kubenswrapper[4869]: E0312 15:10:41.391901 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03a118b-3b5d-437b-b3e1-dcf419ec3065" containerName="nova-scheduler-scheduler" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.391916 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03a118b-3b5d-437b-b3e1-dcf419ec3065" containerName="nova-scheduler-scheduler" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.392143 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03a118b-3b5d-437b-b3e1-dcf419ec3065" containerName="nova-scheduler-scheduler" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.392822 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.395424 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.403406 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.552230 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a25239-e1b9-4ae9-a044-8362e70d6959-config-data\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.552269 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86lxz\" (UniqueName: \"kubernetes.io/projected/f8a25239-e1b9-4ae9-a044-8362e70d6959-kube-api-access-86lxz\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.552533 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a25239-e1b9-4ae9-a044-8362e70d6959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.654685 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a25239-e1b9-4ae9-a044-8362e70d6959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.655285 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a25239-e1b9-4ae9-a044-8362e70d6959-config-data\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.655334 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86lxz\" (UniqueName: \"kubernetes.io/projected/f8a25239-e1b9-4ae9-a044-8362e70d6959-kube-api-access-86lxz\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.660151 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a25239-e1b9-4ae9-a044-8362e70d6959-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.660243 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a25239-e1b9-4ae9-a044-8362e70d6959-config-data\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.674592 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86lxz\" (UniqueName: \"kubernetes.io/projected/f8a25239-e1b9-4ae9-a044-8362e70d6959-kube-api-access-86lxz\") pod \"nova-scheduler-0\" (UID: \"f8a25239-e1b9-4ae9-a044-8362e70d6959\") " pod="openstack/nova-scheduler-0" Mar 12 15:10:41 crc kubenswrapper[4869]: I0312 15:10:41.719910 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.011987 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ded10d99-1db4-470e-bc5f-356cf78424c4","Type":"ContainerStarted","Data":"1446a0095d6082e79fd3453b7645529d1baa230f6f28f031c1178625bbf54237"} Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.012437 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ded10d99-1db4-470e-bc5f-356cf78424c4","Type":"ContainerStarted","Data":"c73fcb2f4d547c08a4e345eadcdff4e64928e37907266e5c111006ed99057c19"} Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.012447 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ded10d99-1db4-470e-bc5f-356cf78424c4","Type":"ContainerStarted","Data":"93f7501cc8b3c1c5767e7518fcea0aeba3782e6fb0bcc8f84732052aeeaa5f7d"} Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.016633 4869 generic.go:334] "Generic (PLEG): container finished" podID="842d495b-32b4-4235-8cce-c4a1af711991" containerID="dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51" exitCode=0 Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.016707 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sz5k6" event={"ID":"842d495b-32b4-4235-8cce-c4a1af711991","Type":"ContainerDied","Data":"dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51"} Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.048089 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.048066218 podStartE2EDuration="2.048066218s" podCreationTimestamp="2026-03-12 15:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:42.038903697 +0000 UTC m=+1394.324129015" 
watchObservedRunningTime="2026-03-12 15:10:42.048066218 +0000 UTC m=+1394.333291496" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.155322 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 15:10:42 crc kubenswrapper[4869]: W0312 15:10:42.155435 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a25239_e1b9_4ae9_a044_8362e70d6959.slice/crio-41be560209089c37db9ec4ca3a0eff3811ca70049f64b70d6ed28e21932be318 WatchSource:0}: Error finding container 41be560209089c37db9ec4ca3a0eff3811ca70049f64b70d6ed28e21932be318: Status 404 returned error can't find the container with id 41be560209089c37db9ec4ca3a0eff3811ca70049f64b70d6ed28e21932be318 Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.355113 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03a118b-3b5d-437b-b3e1-dcf419ec3065" path="/var/lib/kubelet/pods/d03a118b-3b5d-437b-b3e1-dcf419ec3065/volumes" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.667329 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.787440 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-nova-metadata-tls-certs\") pod \"7d835be4-a958-46b9-8319-4a13cb8ee018\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.787514 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d835be4-a958-46b9-8319-4a13cb8ee018-logs\") pod \"7d835be4-a958-46b9-8319-4a13cb8ee018\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.787650 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-config-data\") pod \"7d835be4-a958-46b9-8319-4a13cb8ee018\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.787691 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvlfz\" (UniqueName: \"kubernetes.io/projected/7d835be4-a958-46b9-8319-4a13cb8ee018-kube-api-access-gvlfz\") pod \"7d835be4-a958-46b9-8319-4a13cb8ee018\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.787743 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-combined-ca-bundle\") pod \"7d835be4-a958-46b9-8319-4a13cb8ee018\" (UID: \"7d835be4-a958-46b9-8319-4a13cb8ee018\") " Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.789561 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7d835be4-a958-46b9-8319-4a13cb8ee018-logs" (OuterVolumeSpecName: "logs") pod "7d835be4-a958-46b9-8319-4a13cb8ee018" (UID: "7d835be4-a958-46b9-8319-4a13cb8ee018"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.799797 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d835be4-a958-46b9-8319-4a13cb8ee018-kube-api-access-gvlfz" (OuterVolumeSpecName: "kube-api-access-gvlfz") pod "7d835be4-a958-46b9-8319-4a13cb8ee018" (UID: "7d835be4-a958-46b9-8319-4a13cb8ee018"). InnerVolumeSpecName "kube-api-access-gvlfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.822652 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-config-data" (OuterVolumeSpecName: "config-data") pod "7d835be4-a958-46b9-8319-4a13cb8ee018" (UID: "7d835be4-a958-46b9-8319-4a13cb8ee018"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.826772 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d835be4-a958-46b9-8319-4a13cb8ee018" (UID: "7d835be4-a958-46b9-8319-4a13cb8ee018"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.847207 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7d835be4-a958-46b9-8319-4a13cb8ee018" (UID: "7d835be4-a958-46b9-8319-4a13cb8ee018"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.889902 4869 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.889933 4869 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d835be4-a958-46b9-8319-4a13cb8ee018-logs\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.889943 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.889953 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvlfz\" (UniqueName: \"kubernetes.io/projected/7d835be4-a958-46b9-8319-4a13cb8ee018-kube-api-access-gvlfz\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:42 crc kubenswrapper[4869]: I0312 15:10:42.889987 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d835be4-a958-46b9-8319-4a13cb8ee018-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.030982 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sz5k6" event={"ID":"842d495b-32b4-4235-8cce-c4a1af711991","Type":"ContainerStarted","Data":"9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543"} Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.033797 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"f8a25239-e1b9-4ae9-a044-8362e70d6959","Type":"ContainerStarted","Data":"163a236c34b1f7b997eec5b85c3f90af3e2e437d4cbf11edb346ffad1127e729"} Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.033841 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f8a25239-e1b9-4ae9-a044-8362e70d6959","Type":"ContainerStarted","Data":"41be560209089c37db9ec4ca3a0eff3811ca70049f64b70d6ed28e21932be318"} Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.038761 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerID="bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee" exitCode=0 Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.039375 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.041574 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d835be4-a958-46b9-8319-4a13cb8ee018","Type":"ContainerDied","Data":"bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee"} Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.041645 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d835be4-a958-46b9-8319-4a13cb8ee018","Type":"ContainerDied","Data":"e29d46567310258a937b32673b5e52bc020aade7a46aed78f2f185cd566ec626"} Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.041673 4869 scope.go:117] "RemoveContainer" containerID="bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.064799 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sz5k6" podStartSLOduration=2.619377798 podStartE2EDuration="5.064771847s" podCreationTimestamp="2026-03-12 15:10:38 +0000 UTC" 
firstStartedPulling="2026-03-12 15:10:39.985989183 +0000 UTC m=+1392.271214471" lastFinishedPulling="2026-03-12 15:10:42.431383242 +0000 UTC m=+1394.716608520" observedRunningTime="2026-03-12 15:10:43.050033497 +0000 UTC m=+1395.335258785" watchObservedRunningTime="2026-03-12 15:10:43.064771847 +0000 UTC m=+1395.349997135" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.077884 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.077867849 podStartE2EDuration="2.077867849s" podCreationTimestamp="2026-03-12 15:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:43.075030559 +0000 UTC m=+1395.360255837" watchObservedRunningTime="2026-03-12 15:10:43.077867849 +0000 UTC m=+1395.363093127" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.100029 4869 scope.go:117] "RemoveContainer" containerID="881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.106795 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.115652 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.123710 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:43 crc kubenswrapper[4869]: E0312 15:10:43.124114 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-log" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.124130 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-log" Mar 12 15:10:43 crc kubenswrapper[4869]: E0312 15:10:43.124159 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-metadata" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.124168 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-metadata" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.124322 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-metadata" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.124344 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" containerName="nova-metadata-log" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.125763 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.131387 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.131653 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.141254 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.204088 4869 scope.go:117] "RemoveContainer" containerID="bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee" Mar 12 15:10:43 crc kubenswrapper[4869]: E0312 15:10:43.205183 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee\": container with ID starting with bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee not found: ID does not exist" 
containerID="bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.205313 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee"} err="failed to get container status \"bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee\": rpc error: code = NotFound desc = could not find container \"bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee\": container with ID starting with bb4d62e29cf6dbc63591016052aa5285281f1051df5b2e48e5b66290aa8d96ee not found: ID does not exist" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.205452 4869 scope.go:117] "RemoveContainer" containerID="881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08" Mar 12 15:10:43 crc kubenswrapper[4869]: E0312 15:10:43.205865 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08\": container with ID starting with 881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08 not found: ID does not exist" containerID="881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.205908 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08"} err="failed to get container status \"881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08\": rpc error: code = NotFound desc = could not find container \"881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08\": container with ID starting with 881b0937148b30891a0a3d74a9663e2bd9b18812d2b722a689daa9673d969f08 not found: ID does not exist" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.298845 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.299190 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2defbef6-f414-43b6-b36e-6d60027baa77-logs\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.299322 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt466\" (UniqueName: \"kubernetes.io/projected/2defbef6-f414-43b6-b36e-6d60027baa77-kube-api-access-nt466\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.299734 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.299848 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-config-data\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.401482 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.401565 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-config-data\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.401624 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.401688 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2defbef6-f414-43b6-b36e-6d60027baa77-logs\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.401711 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt466\" (UniqueName: \"kubernetes.io/projected/2defbef6-f414-43b6-b36e-6d60027baa77-kube-api-access-nt466\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.402343 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2defbef6-f414-43b6-b36e-6d60027baa77-logs\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " 
pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.405959 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.406281 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.407479 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2defbef6-f414-43b6-b36e-6d60027baa77-config-data\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.415557 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt466\" (UniqueName: \"kubernetes.io/projected/2defbef6-f414-43b6-b36e-6d60027baa77-kube-api-access-nt466\") pod \"nova-metadata-0\" (UID: \"2defbef6-f414-43b6-b36e-6d60027baa77\") " pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.496081 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 15:10:43 crc kubenswrapper[4869]: I0312 15:10:43.953655 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 15:10:44 crc kubenswrapper[4869]: I0312 15:10:44.049085 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2defbef6-f414-43b6-b36e-6d60027baa77","Type":"ContainerStarted","Data":"e7beaf29431142db15a3bc05eca77164afd587ec518690aba2ffd890a57fbdbe"} Mar 12 15:10:44 crc kubenswrapper[4869]: I0312 15:10:44.355232 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d835be4-a958-46b9-8319-4a13cb8ee018" path="/var/lib/kubelet/pods/7d835be4-a958-46b9-8319-4a13cb8ee018/volumes" Mar 12 15:10:45 crc kubenswrapper[4869]: I0312 15:10:45.061222 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2defbef6-f414-43b6-b36e-6d60027baa77","Type":"ContainerStarted","Data":"1a3ec46481ad85e9d637b91151803bedd39b44dc5eeb03cbefdb142e9f35d954"} Mar 12 15:10:45 crc kubenswrapper[4869]: I0312 15:10:45.061267 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2defbef6-f414-43b6-b36e-6d60027baa77","Type":"ContainerStarted","Data":"2fc445286541421e6017699f20bdd85868c0b44ebc7143125359017647776b5f"} Mar 12 15:10:45 crc kubenswrapper[4869]: I0312 15:10:45.078649 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.078625869 podStartE2EDuration="2.078625869s" podCreationTimestamp="2026-03-12 15:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:10:45.075049037 +0000 UTC m=+1397.360274335" watchObservedRunningTime="2026-03-12 15:10:45.078625869 +0000 UTC m=+1397.363851167" Mar 12 15:10:46 crc kubenswrapper[4869]: I0312 15:10:46.720844 4869 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 12 15:10:48 crc kubenswrapper[4869]: I0312 15:10:48.496963 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:10:48 crc kubenswrapper[4869]: I0312 15:10:48.497336 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 15:10:48 crc kubenswrapper[4869]: I0312 15:10:48.682026 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:48 crc kubenswrapper[4869]: I0312 15:10:48.682077 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:48 crc kubenswrapper[4869]: I0312 15:10:48.726084 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:49 crc kubenswrapper[4869]: I0312 15:10:49.142344 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:49 crc kubenswrapper[4869]: I0312 15:10:49.208999 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sz5k6"] Mar 12 15:10:50 crc kubenswrapper[4869]: I0312 15:10:50.650206 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 15:10:50 crc kubenswrapper[4869]: I0312 15:10:50.650970 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.115937 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sz5k6" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="registry-server" 
containerID="cri-o://9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543" gracePeriod=2 Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.377134 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tj6l5"] Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.386704 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.411958 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj6l5"] Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.509849 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrjng\" (UniqueName: \"kubernetes.io/projected/033c416a-2107-40d2-950b-f83b72d7b681-kube-api-access-xrjng\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.510053 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-catalog-content\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.510161 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-utilities\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.612074 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-catalog-content\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.612453 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-utilities\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.612557 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrjng\" (UniqueName: \"kubernetes.io/projected/033c416a-2107-40d2-950b-f83b72d7b681-kube-api-access-xrjng\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.612942 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-utilities\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.614763 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-catalog-content\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.639448 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xrjng\" (UniqueName: \"kubernetes.io/projected/033c416a-2107-40d2-950b-f83b72d7b681-kube-api-access-xrjng\") pod \"redhat-marketplace-tj6l5\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.664480 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ded10d99-1db4-470e-bc5f-356cf78424c4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.664903 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ded10d99-1db4-470e-bc5f-356cf78424c4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.721126 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.727356 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.772260 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.832821 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.916867 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjjkk\" (UniqueName: \"kubernetes.io/projected/842d495b-32b4-4235-8cce-c4a1af711991-kube-api-access-gjjkk\") pod \"842d495b-32b4-4235-8cce-c4a1af711991\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.916980 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-utilities\") pod \"842d495b-32b4-4235-8cce-c4a1af711991\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.917070 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-catalog-content\") pod \"842d495b-32b4-4235-8cce-c4a1af711991\" (UID: \"842d495b-32b4-4235-8cce-c4a1af711991\") " Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.918094 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-utilities" (OuterVolumeSpecName: "utilities") pod "842d495b-32b4-4235-8cce-c4a1af711991" (UID: "842d495b-32b4-4235-8cce-c4a1af711991"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.923203 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842d495b-32b4-4235-8cce-c4a1af711991-kube-api-access-gjjkk" (OuterVolumeSpecName: "kube-api-access-gjjkk") pod "842d495b-32b4-4235-8cce-c4a1af711991" (UID: "842d495b-32b4-4235-8cce-c4a1af711991"). InnerVolumeSpecName "kube-api-access-gjjkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.942018 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.942047 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjjkk\" (UniqueName: \"kubernetes.io/projected/842d495b-32b4-4235-8cce-c4a1af711991-kube-api-access-gjjkk\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:51 crc kubenswrapper[4869]: I0312 15:10:51.996467 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "842d495b-32b4-4235-8cce-c4a1af711991" (UID: "842d495b-32b4-4235-8cce-c4a1af711991"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.043637 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/842d495b-32b4-4235-8cce-c4a1af711991-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.129190 4869 generic.go:334] "Generic (PLEG): container finished" podID="842d495b-32b4-4235-8cce-c4a1af711991" containerID="9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543" exitCode=0 Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.129239 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sz5k6" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.129245 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sz5k6" event={"ID":"842d495b-32b4-4235-8cce-c4a1af711991","Type":"ContainerDied","Data":"9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543"} Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.129891 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sz5k6" event={"ID":"842d495b-32b4-4235-8cce-c4a1af711991","Type":"ContainerDied","Data":"073b793021b23a6a4322c1aaae80f0c602462bbf36293a6875ac0375ee293a77"} Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.129912 4869 scope.go:117] "RemoveContainer" containerID="9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.161628 4869 scope.go:117] "RemoveContainer" containerID="dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.163322 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sz5k6"] Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.166872 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.174019 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sz5k6"] Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.193225 4869 scope.go:117] "RemoveContainer" containerID="cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.210261 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj6l5"] Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 
15:10:52.225117 4869 scope.go:117] "RemoveContainer" containerID="9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543" Mar 12 15:10:52 crc kubenswrapper[4869]: E0312 15:10:52.225787 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543\": container with ID starting with 9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543 not found: ID does not exist" containerID="9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.225822 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543"} err="failed to get container status \"9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543\": rpc error: code = NotFound desc = could not find container \"9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543\": container with ID starting with 9a4d2a60c59276b4349a976cb95338ac3528870208a56bf3c0c3036f6d1f1543 not found: ID does not exist" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.225867 4869 scope.go:117] "RemoveContainer" containerID="dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51" Mar 12 15:10:52 crc kubenswrapper[4869]: E0312 15:10:52.226184 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51\": container with ID starting with dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51 not found: ID does not exist" containerID="dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.226227 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51"} err="failed to get container status \"dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51\": rpc error: code = NotFound desc = could not find container \"dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51\": container with ID starting with dff057b6ab07f796ee5a8bfadb9b50543e16c73b3d5dd2883307adabca6b0f51 not found: ID does not exist" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.226263 4869 scope.go:117] "RemoveContainer" containerID="cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc" Mar 12 15:10:52 crc kubenswrapper[4869]: E0312 15:10:52.226554 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc\": container with ID starting with cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc not found: ID does not exist" containerID="cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.226593 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc"} err="failed to get container status \"cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc\": rpc error: code = NotFound desc = could not find container \"cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc\": container with ID starting with cde1c278734d5ecbe196d95a9c38a4487f773fa7357efff7c6d7ca088f2939bc not found: ID does not exist" Mar 12 15:10:52 crc kubenswrapper[4869]: I0312 15:10:52.349298 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842d495b-32b4-4235-8cce-c4a1af711991" path="/var/lib/kubelet/pods/842d495b-32b4-4235-8cce-c4a1af711991/volumes" Mar 12 15:10:53 crc kubenswrapper[4869]: I0312 
15:10:53.142486 4869 generic.go:334] "Generic (PLEG): container finished" podID="033c416a-2107-40d2-950b-f83b72d7b681" containerID="e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a" exitCode=0 Mar 12 15:10:53 crc kubenswrapper[4869]: I0312 15:10:53.142770 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj6l5" event={"ID":"033c416a-2107-40d2-950b-f83b72d7b681","Type":"ContainerDied","Data":"e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a"} Mar 12 15:10:53 crc kubenswrapper[4869]: I0312 15:10:53.143143 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj6l5" event={"ID":"033c416a-2107-40d2-950b-f83b72d7b681","Type":"ContainerStarted","Data":"23daa6f3855dc2ea44ef54f25c4dbd9b042d0be3a0448ef6d6f79efc5eae5f06"} Mar 12 15:10:53 crc kubenswrapper[4869]: I0312 15:10:53.497354 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 15:10:53 crc kubenswrapper[4869]: I0312 15:10:53.497437 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 15:10:54 crc kubenswrapper[4869]: I0312 15:10:54.156314 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj6l5" event={"ID":"033c416a-2107-40d2-950b-f83b72d7b681","Type":"ContainerStarted","Data":"095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c"} Mar 12 15:10:54 crc kubenswrapper[4869]: I0312 15:10:54.509677 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2defbef6-f414-43b6-b36e-6d60027baa77" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:54 crc kubenswrapper[4869]: I0312 15:10:54.509709 4869 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="2defbef6-f414-43b6-b36e-6d60027baa77" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 15:10:55 crc kubenswrapper[4869]: I0312 15:10:55.164920 4869 generic.go:334] "Generic (PLEG): container finished" podID="033c416a-2107-40d2-950b-f83b72d7b681" containerID="095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c" exitCode=0 Mar 12 15:10:55 crc kubenswrapper[4869]: I0312 15:10:55.164971 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj6l5" event={"ID":"033c416a-2107-40d2-950b-f83b72d7b681","Type":"ContainerDied","Data":"095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c"} Mar 12 15:10:56 crc kubenswrapper[4869]: I0312 15:10:56.178280 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj6l5" event={"ID":"033c416a-2107-40d2-950b-f83b72d7b681","Type":"ContainerStarted","Data":"adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6"} Mar 12 15:10:56 crc kubenswrapper[4869]: I0312 15:10:56.201749 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tj6l5" podStartSLOduration=2.552956584 podStartE2EDuration="5.201727823s" podCreationTimestamp="2026-03-12 15:10:51 +0000 UTC" firstStartedPulling="2026-03-12 15:10:53.145594375 +0000 UTC m=+1405.430819653" lastFinishedPulling="2026-03-12 15:10:55.794365614 +0000 UTC m=+1408.079590892" observedRunningTime="2026-03-12 15:10:56.19425859 +0000 UTC m=+1408.479483868" watchObservedRunningTime="2026-03-12 15:10:56.201727823 +0000 UTC m=+1408.486953101" Mar 12 15:11:00 crc kubenswrapper[4869]: I0312 15:11:00.656350 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 15:11:00 crc 
kubenswrapper[4869]: I0312 15:11:00.657963 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 15:11:00 crc kubenswrapper[4869]: I0312 15:11:00.661575 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 15:11:00 crc kubenswrapper[4869]: I0312 15:11:00.664994 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 15:11:01 crc kubenswrapper[4869]: I0312 15:11:01.223222 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 15:11:01 crc kubenswrapper[4869]: I0312 15:11:01.231044 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 15:11:01 crc kubenswrapper[4869]: I0312 15:11:01.731528 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:11:01 crc kubenswrapper[4869]: I0312 15:11:01.732282 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:11:01 crc kubenswrapper[4869]: I0312 15:11:01.781901 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:11:02 crc kubenswrapper[4869]: I0312 15:11:02.240548 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 15:11:02 crc kubenswrapper[4869]: I0312 15:11:02.309003 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:11:02 crc kubenswrapper[4869]: I0312 15:11:02.362208 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj6l5"] Mar 12 15:11:03 crc kubenswrapper[4869]: I0312 15:11:03.502421 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 15:11:03 crc kubenswrapper[4869]: I0312 15:11:03.510514 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 15:11:03 crc kubenswrapper[4869]: I0312 15:11:03.516292 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.248471 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tj6l5" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="registry-server" containerID="cri-o://adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6" gracePeriod=2 Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.263332 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.771078 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.868903 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-catalog-content\") pod \"033c416a-2107-40d2-950b-f83b72d7b681\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.869005 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-utilities\") pod \"033c416a-2107-40d2-950b-f83b72d7b681\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.869027 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrjng\" (UniqueName: \"kubernetes.io/projected/033c416a-2107-40d2-950b-f83b72d7b681-kube-api-access-xrjng\") pod \"033c416a-2107-40d2-950b-f83b72d7b681\" (UID: \"033c416a-2107-40d2-950b-f83b72d7b681\") " Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.870133 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-utilities" (OuterVolumeSpecName: "utilities") pod "033c416a-2107-40d2-950b-f83b72d7b681" (UID: "033c416a-2107-40d2-950b-f83b72d7b681"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.874755 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033c416a-2107-40d2-950b-f83b72d7b681-kube-api-access-xrjng" (OuterVolumeSpecName: "kube-api-access-xrjng") pod "033c416a-2107-40d2-950b-f83b72d7b681" (UID: "033c416a-2107-40d2-950b-f83b72d7b681"). InnerVolumeSpecName "kube-api-access-xrjng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.893022 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "033c416a-2107-40d2-950b-f83b72d7b681" (UID: "033c416a-2107-40d2-950b-f83b72d7b681"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.971612 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.971806 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrjng\" (UniqueName: \"kubernetes.io/projected/033c416a-2107-40d2-950b-f83b72d7b681-kube-api-access-xrjng\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:04 crc kubenswrapper[4869]: I0312 15:11:04.971900 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033c416a-2107-40d2-950b-f83b72d7b681-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.258100 4869 generic.go:334] "Generic (PLEG): container finished" podID="033c416a-2107-40d2-950b-f83b72d7b681" containerID="adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6" exitCode=0 Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.259302 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tj6l5" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.261687 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj6l5" event={"ID":"033c416a-2107-40d2-950b-f83b72d7b681","Type":"ContainerDied","Data":"adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6"} Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.261767 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tj6l5" event={"ID":"033c416a-2107-40d2-950b-f83b72d7b681","Type":"ContainerDied","Data":"23daa6f3855dc2ea44ef54f25c4dbd9b042d0be3a0448ef6d6f79efc5eae5f06"} Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.261790 4869 scope.go:117] "RemoveContainer" containerID="adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.287316 4869 scope.go:117] "RemoveContainer" containerID="095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.296462 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj6l5"] Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.305695 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tj6l5"] Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.312528 4869 scope.go:117] "RemoveContainer" containerID="e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.357830 4869 scope.go:117] "RemoveContainer" containerID="adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6" Mar 12 15:11:05 crc kubenswrapper[4869]: E0312 15:11:05.358328 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6\": container with ID starting with adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6 not found: ID does not exist" containerID="adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.358364 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6"} err="failed to get container status \"adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6\": rpc error: code = NotFound desc = could not find container \"adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6\": container with ID starting with adf12739b12e658bfa4b57770e3bb13eab5bf4c06d119112a178852b31565fc6 not found: ID does not exist" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.358389 4869 scope.go:117] "RemoveContainer" containerID="095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c" Mar 12 15:11:05 crc kubenswrapper[4869]: E0312 15:11:05.358730 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c\": container with ID starting with 095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c not found: ID does not exist" containerID="095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.358756 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c"} err="failed to get container status \"095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c\": rpc error: code = NotFound desc = could not find container \"095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c\": container with ID 
starting with 095a579631f9bb8f6b065bbea919bf8ff5c0d26fa21f2c724bf320eb4a22744c not found: ID does not exist" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.358770 4869 scope.go:117] "RemoveContainer" containerID="e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a" Mar 12 15:11:05 crc kubenswrapper[4869]: E0312 15:11:05.358967 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a\": container with ID starting with e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a not found: ID does not exist" containerID="e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.358991 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a"} err="failed to get container status \"e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a\": rpc error: code = NotFound desc = could not find container \"e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a\": container with ID starting with e785906055b758ed86c0754fe153cef4ef117dcbc8dc56df05d539d758f07a9a not found: ID does not exist" Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.697031 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:11:05 crc kubenswrapper[4869]: I0312 15:11:05.697290 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3" containerName="kube-state-metrics" containerID="cri-o://5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e" gracePeriod=30 Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.176845 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.192416 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bxxn\" (UniqueName: \"kubernetes.io/projected/5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3-kube-api-access-8bxxn\") pod \"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3\" (UID: \"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3\") " Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.201327 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3-kube-api-access-8bxxn" (OuterVolumeSpecName: "kube-api-access-8bxxn") pod "5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3" (UID: "5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3"). InnerVolumeSpecName "kube-api-access-8bxxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.270348 4869 generic.go:334] "Generic (PLEG): container finished" podID="5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3" containerID="5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e" exitCode=2 Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.270446 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3","Type":"ContainerDied","Data":"5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e"} Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.270510 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3","Type":"ContainerDied","Data":"9bc697c96f2c504a60ace19a99b1c769b5b869a718d5b7f73938d03e5925e355"} Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.270505 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.270528 4869 scope.go:117] "RemoveContainer" containerID="5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.300485 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bxxn\" (UniqueName: \"kubernetes.io/projected/5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3-kube-api-access-8bxxn\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.303922 4869 scope.go:117] "RemoveContainer" containerID="5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e" Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 15:11:06.306434 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e\": container with ID starting with 5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e not found: ID does not exist" containerID="5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.306466 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e"} err="failed to get container status \"5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e\": rpc error: code = NotFound desc = could not find container \"5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e\": container with ID starting with 5a1b42db99d09470784dcb7982ce70a4984212578a29d237a8442c595ca4b20e not found: ID does not exist" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.317789 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.328448 4869 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.349363 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033c416a-2107-40d2-950b-f83b72d7b681" path="/var/lib/kubelet/pods/033c416a-2107-40d2-950b-f83b72d7b681/volumes" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.350220 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3" path="/var/lib/kubelet/pods/5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3/volumes" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.350775 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 15:11:06.351075 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3" containerName="kube-state-metrics" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351126 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3" containerName="kube-state-metrics" Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 15:11:06.351145 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="extract-utilities" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351151 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="extract-utilities" Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 15:11:06.351173 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="extract-content" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351180 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="extract-content" Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 
15:11:06.351200 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="registry-server" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351205 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="registry-server" Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 15:11:06.351216 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="extract-content" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351222 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="extract-content" Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 15:11:06.351231 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="extract-utilities" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351237 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="extract-utilities" Mar 12 15:11:06 crc kubenswrapper[4869]: E0312 15:11:06.351246 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="registry-server" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351251 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="registry-server" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351418 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="842d495b-32b4-4235-8cce-c4a1af711991" containerName="registry-server" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.351439 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2e4870-a7a0-4271-b4dd-a68fc8a80ef3" containerName="kube-state-metrics" Mar 12 15:11:06 crc kubenswrapper[4869]: 
I0312 15:11:06.351452 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="033c416a-2107-40d2-950b-f83b72d7b681" containerName="registry-server" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.352530 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.352652 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.355040 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.358779 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.401934 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.402012 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74gdt\" (UniqueName: \"kubernetes.io/projected/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-api-access-74gdt\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.402181 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" 
(UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.402383 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.503832 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.503941 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.504002 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74gdt\" (UniqueName: \"kubernetes.io/projected/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-api-access-74gdt\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.504044 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.507924 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.508275 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.510326 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39aee064-bbb0-45fa-93b9-8c720fdca852-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.523959 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74gdt\" (UniqueName: \"kubernetes.io/projected/39aee064-bbb0-45fa-93b9-8c720fdca852-kube-api-access-74gdt\") pod \"kube-state-metrics-0\" (UID: \"39aee064-bbb0-45fa-93b9-8c720fdca852\") " pod="openstack/kube-state-metrics-0" Mar 12 15:11:06 crc kubenswrapper[4869]: I0312 15:11:06.672811 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 15:11:07 crc kubenswrapper[4869]: I0312 15:11:07.141626 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 15:11:07 crc kubenswrapper[4869]: W0312 15:11:07.143269 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39aee064_bbb0_45fa_93b9_8c720fdca852.slice/crio-0a91c94b5a066b9e390de941c7212cf223473f1cd89d615392861ef84dc73799 WatchSource:0}: Error finding container 0a91c94b5a066b9e390de941c7212cf223473f1cd89d615392861ef84dc73799: Status 404 returned error can't find the container with id 0a91c94b5a066b9e390de941c7212cf223473f1cd89d615392861ef84dc73799 Mar 12 15:11:07 crc kubenswrapper[4869]: I0312 15:11:07.283411 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39aee064-bbb0-45fa-93b9-8c720fdca852","Type":"ContainerStarted","Data":"0a91c94b5a066b9e390de941c7212cf223473f1cd89d615392861ef84dc73799"} Mar 12 15:11:07 crc kubenswrapper[4869]: I0312 15:11:07.341329 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:11:07 crc kubenswrapper[4869]: I0312 15:11:07.341696 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-central-agent" containerID="cri-o://5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4" gracePeriod=30 Mar 12 15:11:07 crc kubenswrapper[4869]: I0312 15:11:07.341765 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="sg-core" containerID="cri-o://64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b" gracePeriod=30 Mar 12 15:11:07 crc kubenswrapper[4869]: I0312 15:11:07.341797 4869 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-notification-agent" containerID="cri-o://4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae" gracePeriod=30 Mar 12 15:11:07 crc kubenswrapper[4869]: I0312 15:11:07.341780 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="proxy-httpd" containerID="cri-o://c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7" gracePeriod=30 Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.297641 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"39aee064-bbb0-45fa-93b9-8c720fdca852","Type":"ContainerStarted","Data":"7b06032ca98f5979309350f6a27d978036c542762249291be1b76d5192bf59a6"} Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.298151 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.299649 4869 generic.go:334] "Generic (PLEG): container finished" podID="b75c9a12-7533-4855-b543-e75d0bb77857" containerID="c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7" exitCode=0 Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.299665 4869 generic.go:334] "Generic (PLEG): container finished" podID="b75c9a12-7533-4855-b543-e75d0bb77857" containerID="64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b" exitCode=2 Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.299673 4869 generic.go:334] "Generic (PLEG): container finished" podID="b75c9a12-7533-4855-b543-e75d0bb77857" containerID="5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4" exitCode=0 Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.299687 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerDied","Data":"c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7"} Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.299703 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerDied","Data":"64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b"} Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.299714 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerDied","Data":"5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4"} Mar 12 15:11:08 crc kubenswrapper[4869]: I0312 15:11:08.317565 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.90617189 podStartE2EDuration="2.317512792s" podCreationTimestamp="2026-03-12 15:11:06 +0000 UTC" firstStartedPulling="2026-03-12 15:11:07.14614828 +0000 UTC m=+1419.431373558" lastFinishedPulling="2026-03-12 15:11:07.557489172 +0000 UTC m=+1419.842714460" observedRunningTime="2026-03-12 15:11:08.313692683 +0000 UTC m=+1420.598917961" watchObservedRunningTime="2026-03-12 15:11:08.317512792 +0000 UTC m=+1420.602738070" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.713821 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.895415 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-sg-core-conf-yaml\") pod \"b75c9a12-7533-4855-b543-e75d0bb77857\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.896072 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-combined-ca-bundle\") pod \"b75c9a12-7533-4855-b543-e75d0bb77857\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.896238 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdqvm\" (UniqueName: \"kubernetes.io/projected/b75c9a12-7533-4855-b543-e75d0bb77857-kube-api-access-rdqvm\") pod \"b75c9a12-7533-4855-b543-e75d0bb77857\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.896343 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-run-httpd\") pod \"b75c9a12-7533-4855-b543-e75d0bb77857\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.896473 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-config-data\") pod \"b75c9a12-7533-4855-b543-e75d0bb77857\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.896653 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-log-httpd\") pod \"b75c9a12-7533-4855-b543-e75d0bb77857\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.896765 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-scripts\") pod \"b75c9a12-7533-4855-b543-e75d0bb77857\" (UID: \"b75c9a12-7533-4855-b543-e75d0bb77857\") " Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.897480 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b75c9a12-7533-4855-b543-e75d0bb77857" (UID: "b75c9a12-7533-4855-b543-e75d0bb77857"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.898527 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b75c9a12-7533-4855-b543-e75d0bb77857" (UID: "b75c9a12-7533-4855-b543-e75d0bb77857"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.902370 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-scripts" (OuterVolumeSpecName: "scripts") pod "b75c9a12-7533-4855-b543-e75d0bb77857" (UID: "b75c9a12-7533-4855-b543-e75d0bb77857"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.905895 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b75c9a12-7533-4855-b543-e75d0bb77857-kube-api-access-rdqvm" (OuterVolumeSpecName: "kube-api-access-rdqvm") pod "b75c9a12-7533-4855-b543-e75d0bb77857" (UID: "b75c9a12-7533-4855-b543-e75d0bb77857"). InnerVolumeSpecName "kube-api-access-rdqvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.926201 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b75c9a12-7533-4855-b543-e75d0bb77857" (UID: "b75c9a12-7533-4855-b543-e75d0bb77857"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.987373 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b75c9a12-7533-4855-b543-e75d0bb77857" (UID: "b75c9a12-7533-4855-b543-e75d0bb77857"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.998435 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.998464 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdqvm\" (UniqueName: \"kubernetes.io/projected/b75c9a12-7533-4855-b543-e75d0bb77857-kube-api-access-rdqvm\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.998475 4869 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.998483 4869 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b75c9a12-7533-4855-b543-e75d0bb77857-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.998492 4869 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:10 crc kubenswrapper[4869]: I0312 15:11:10.998500 4869 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.008249 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-config-data" (OuterVolumeSpecName: "config-data") pod "b75c9a12-7533-4855-b543-e75d0bb77857" (UID: "b75c9a12-7533-4855-b543-e75d0bb77857"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.100475 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b75c9a12-7533-4855-b543-e75d0bb77857-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.328504 4869 generic.go:334] "Generic (PLEG): container finished" podID="b75c9a12-7533-4855-b543-e75d0bb77857" containerID="4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae" exitCode=0 Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.328566 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerDied","Data":"4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae"} Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.328608 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b75c9a12-7533-4855-b543-e75d0bb77857","Type":"ContainerDied","Data":"fd0013b5e060bf89207eae199fb4346d009e593bdfaea97f471765e3143384d0"} Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.328621 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.328638 4869 scope.go:117] "RemoveContainer" containerID="c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.369921 4869 scope.go:117] "RemoveContainer" containerID="64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.393371 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.399941 4869 scope.go:117] "RemoveContainer" containerID="4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.411326 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.421978 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.422387 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-notification-agent" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422404 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-notification-agent" Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.422431 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="sg-core" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422438 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="sg-core" Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.422455 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-central-agent" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422461 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-central-agent" Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.422468 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="proxy-httpd" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422473 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="proxy-httpd" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422702 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-notification-agent" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422721 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="proxy-httpd" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422730 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="ceilometer-central-agent" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.422737 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" containerName="sg-core" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.424504 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.425604 4869 scope.go:117] "RemoveContainer" containerID="5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.428379 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.428664 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.428805 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.430057 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.482896 4869 scope.go:117] "RemoveContainer" containerID="c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7" Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.483311 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7\": container with ID starting with c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7 not found: ID does not exist" containerID="c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.483342 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7"} err="failed to get container status \"c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7\": rpc error: code = NotFound desc = could not find container \"c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7\": 
container with ID starting with c3aa541af7677f1724d41cd410e143248e18bfb7e5ad57061403bfd9dc3ae1f7 not found: ID does not exist" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.483364 4869 scope.go:117] "RemoveContainer" containerID="64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b" Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.483648 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b\": container with ID starting with 64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b not found: ID does not exist" containerID="64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.483673 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b"} err="failed to get container status \"64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b\": rpc error: code = NotFound desc = could not find container \"64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b\": container with ID starting with 64d950dc347615b05eef2c08928bfe51858ba4991d6e8114851333a2167fd80b not found: ID does not exist" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.483688 4869 scope.go:117] "RemoveContainer" containerID="4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae" Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.484435 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae\": container with ID starting with 4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae not found: ID does not exist" 
containerID="4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.484458 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae"} err="failed to get container status \"4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae\": rpc error: code = NotFound desc = could not find container \"4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae\": container with ID starting with 4cbd5ecd349acba250031dfba5b523eaeefb0f426e6d4f0b2c51b948c7beb6ae not found: ID does not exist" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.484470 4869 scope.go:117] "RemoveContainer" containerID="5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4" Mar 12 15:11:11 crc kubenswrapper[4869]: E0312 15:11:11.485011 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4\": container with ID starting with 5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4 not found: ID does not exist" containerID="5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.485035 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4"} err="failed to get container status \"5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4\": rpc error: code = NotFound desc = could not find container \"5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4\": container with ID starting with 5e54392c9966610ce2972edc6f06fca4e394ebeb9d3641035cb4c28f8653adf4 not found: ID does not exist" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.509879 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-scripts\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.509946 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.510105 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-config-data\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.510179 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/699fa772-3cdc-41d3-840e-ced246a8090f-log-httpd\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.510251 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdz6v\" (UniqueName: \"kubernetes.io/projected/699fa772-3cdc-41d3-840e-ced246a8090f-kube-api-access-fdz6v\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.510277 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/699fa772-3cdc-41d3-840e-ced246a8090f-run-httpd\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.510317 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.510351 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-scripts\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612400 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612491 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-config-data\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 
15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612510 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/699fa772-3cdc-41d3-840e-ced246a8090f-log-httpd\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612575 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdz6v\" (UniqueName: \"kubernetes.io/projected/699fa772-3cdc-41d3-840e-ced246a8090f-kube-api-access-fdz6v\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612593 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/699fa772-3cdc-41d3-840e-ced246a8090f-run-httpd\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612625 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.612645 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.613334 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/699fa772-3cdc-41d3-840e-ced246a8090f-run-httpd\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.613442 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/699fa772-3cdc-41d3-840e-ced246a8090f-log-httpd\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.617387 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.617868 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-scripts\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.618783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-config-data\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.619367 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.625668 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/699fa772-3cdc-41d3-840e-ced246a8090f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.634451 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdz6v\" (UniqueName: \"kubernetes.io/projected/699fa772-3cdc-41d3-840e-ced246a8090f-kube-api-access-fdz6v\") pod \"ceilometer-0\" (UID: \"699fa772-3cdc-41d3-840e-ced246a8090f\") " pod="openstack/ceilometer-0" Mar 12 15:11:11 crc kubenswrapper[4869]: I0312 15:11:11.785432 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 15:11:12 crc kubenswrapper[4869]: I0312 15:11:12.256011 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 15:11:12 crc kubenswrapper[4869]: W0312 15:11:12.264048 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod699fa772_3cdc_41d3_840e_ced246a8090f.slice/crio-766fcea273e9cadd424b66a9939013ca73b59c15fa63d4f9c40c9131f83b1c08 WatchSource:0}: Error finding container 766fcea273e9cadd424b66a9939013ca73b59c15fa63d4f9c40c9131f83b1c08: Status 404 returned error can't find the container with id 766fcea273e9cadd424b66a9939013ca73b59c15fa63d4f9c40c9131f83b1c08 Mar 12 15:11:12 crc kubenswrapper[4869]: I0312 15:11:12.266476 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:11:12 crc kubenswrapper[4869]: I0312 15:11:12.348423 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b75c9a12-7533-4855-b543-e75d0bb77857" path="/var/lib/kubelet/pods/b75c9a12-7533-4855-b543-e75d0bb77857/volumes" Mar 12 15:11:12 crc kubenswrapper[4869]: I0312 15:11:12.349366 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"699fa772-3cdc-41d3-840e-ced246a8090f","Type":"ContainerStarted","Data":"766fcea273e9cadd424b66a9939013ca73b59c15fa63d4f9c40c9131f83b1c08"} Mar 12 15:11:13 crc kubenswrapper[4869]: I0312 15:11:13.367496 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"699fa772-3cdc-41d3-840e-ced246a8090f","Type":"ContainerStarted","Data":"3c45745967901dea18218aff0cc71fb2ca72b50d6d00f09e755f2bd4c61afbde"} Mar 12 15:11:13 crc kubenswrapper[4869]: I0312 15:11:13.580124 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:11:13 crc kubenswrapper[4869]: I0312 15:11:13.692191 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:11:14 crc kubenswrapper[4869]: I0312 15:11:14.380357 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"699fa772-3cdc-41d3-840e-ced246a8090f","Type":"ContainerStarted","Data":"5f73c8d455be0e3f9e6af4e54d040ca3a5a40b651a836a09b98c5e5eabe1b545"} Mar 12 15:11:14 crc kubenswrapper[4869]: I0312 15:11:14.380677 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"699fa772-3cdc-41d3-840e-ced246a8090f","Type":"ContainerStarted","Data":"9f5a7e6b56a3dfa896951399fcd0a8062f43e3ffedc4e32ea1fb806d63d83bab"} Mar 12 15:11:16 crc kubenswrapper[4869]: I0312 15:11:16.409110 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"699fa772-3cdc-41d3-840e-ced246a8090f","Type":"ContainerStarted","Data":"2a387f98c7c01c949d1e7af9d5bfdc22c8d16c3a4b79f46b3ae605f9afd42c98"} Mar 12 15:11:16 crc kubenswrapper[4869]: I0312 15:11:16.412920 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 15:11:16 crc kubenswrapper[4869]: I0312 15:11:16.439525 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=1.843585596 podStartE2EDuration="5.439508025s" podCreationTimestamp="2026-03-12 15:11:11 +0000 UTC" firstStartedPulling="2026-03-12 15:11:12.266284308 +0000 UTC m=+1424.551509586" lastFinishedPulling="2026-03-12 15:11:15.862206737 +0000 UTC m=+1428.147432015" observedRunningTime="2026-03-12 15:11:16.437187668 +0000 UTC m=+1428.722413006" watchObservedRunningTime="2026-03-12 15:11:16.439508025 +0000 UTC m=+1428.724733303" Mar 12 15:11:16 crc kubenswrapper[4869]: I0312 15:11:16.691040 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 12 15:11:18 crc kubenswrapper[4869]: I0312 15:11:18.160866 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerName="rabbitmq" containerID="cri-o://2697723ecc4b8b6205845e7410eff3ca8ae4c708e1ae9679c82243acb8b0acc3" gracePeriod=604796 Mar 12 15:11:18 crc kubenswrapper[4869]: I0312 15:11:18.326373 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerName="rabbitmq" containerID="cri-o://d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548" gracePeriod=604796 Mar 12 15:11:19 crc kubenswrapper[4869]: I0312 15:11:19.684808 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:11:19 crc kubenswrapper[4869]: I0312 15:11:19.685177 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:11:23 crc kubenswrapper[4869]: I0312 15:11:23.637676 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.007884 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.505828 4869 generic.go:334] "Generic (PLEG): container finished" podID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerID="2697723ecc4b8b6205845e7410eff3ca8ae4c708e1ae9679c82243acb8b0acc3" exitCode=0 Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.506130 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0323899-ea3b-4572-baa4-3483b0d5fd86","Type":"ContainerDied","Data":"2697723ecc4b8b6205845e7410eff3ca8ae4c708e1ae9679c82243acb8b0acc3"} Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.775295 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.864091 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884265 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0323899-ea3b-4572-baa4-3483b0d5fd86-pod-info\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884331 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-erlang-cookie\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884367 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e764959-1933-4a88-b8de-fd853d49a0d3-erlang-cookie-secret\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884390 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-tls\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884465 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-plugins\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884487 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e764959-1933-4a88-b8de-fd853d49a0d3-pod-info\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884508 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884643 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-erlang-cookie\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884671 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-tls\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884707 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-config-data\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884736 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884784 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-confd\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884838 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-server-conf\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884874 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-config-data\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884921 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-confd\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.884986 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-plugins-conf\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.885017 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdbmn\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-kube-api-access-zdbmn\") pod 
\"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.885042 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-plugins\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.885066 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-server-conf\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.885094 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-plugins-conf\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.885125 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6x4h\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-kube-api-access-l6x4h\") pod \"3e764959-1933-4a88-b8de-fd853d49a0d3\" (UID: \"3e764959-1933-4a88-b8de-fd853d49a0d3\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.885177 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0323899-ea3b-4572-baa4-3483b0d5fd86-erlang-cookie-secret\") pod \"e0323899-ea3b-4572-baa4-3483b0d5fd86\" (UID: \"e0323899-ea3b-4572-baa4-3483b0d5fd86\") " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.887479 4869 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.888211 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.888702 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.893612 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.893747 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.894291 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.905253 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3e764959-1933-4a88-b8de-fd853d49a0d3-pod-info" (OuterVolumeSpecName: "pod-info") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.906145 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.909000 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.909042 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e764959-1933-4a88-b8de-fd853d49a0d3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.909496 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.909598 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-kube-api-access-zdbmn" (OuterVolumeSpecName: "kube-api-access-zdbmn") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "kube-api-access-zdbmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.909692 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e0323899-ea3b-4572-baa4-3483b0d5fd86-pod-info" (OuterVolumeSpecName: "pod-info") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.910523 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0323899-ea3b-4572-baa4-3483b0d5fd86-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.910898 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.912085 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-kube-api-access-l6x4h" (OuterVolumeSpecName: "kube-api-access-l6x4h") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "kube-api-access-l6x4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.939578 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-config-data" (OuterVolumeSpecName: "config-data") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.961648 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-config-data" (OuterVolumeSpecName: "config-data") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.984941 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-server-conf" (OuterVolumeSpecName: "server-conf") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988602 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988638 4869 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3e764959-1933-4a88-b8de-fd853d49a0d3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988650 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988659 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988668 4869 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3e764959-1933-4a88-b8de-fd853d49a0d3-pod-info\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988696 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988706 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 
15:11:24.988718 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988727 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988742 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988751 4869 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-server-conf\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988762 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988770 4869 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3e764959-1933-4a88-b8de-fd853d49a0d3-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988779 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdbmn\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-kube-api-access-zdbmn\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988788 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988797 4869 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988804 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6x4h\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-kube-api-access-l6x4h\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988813 4869 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0323899-ea3b-4572-baa4-3483b0d5fd86-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:24 crc kubenswrapper[4869]: I0312 15:11:24.988820 4869 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0323899-ea3b-4572-baa4-3483b0d5fd86-pod-info\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.021028 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-server-conf" (OuterVolumeSpecName: "server-conf") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.032116 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.048716 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e0323899-ea3b-4572-baa4-3483b0d5fd86" (UID: "e0323899-ea3b-4572-baa4-3483b0d5fd86"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.055463 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.083224 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3e764959-1933-4a88-b8de-fd853d49a0d3" (UID: "3e764959-1933-4a88-b8de-fd853d49a0d3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.091737 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.091803 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.091816 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0323899-ea3b-4572-baa4-3483b0d5fd86-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.091825 4869 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3e764959-1933-4a88-b8de-fd853d49a0d3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.091834 4869 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0323899-ea3b-4572-baa4-3483b0d5fd86-server-conf\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.517916 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e0323899-ea3b-4572-baa4-3483b0d5fd86","Type":"ContainerDied","Data":"e8b3d6437b78bc78380602884eef512b1bdb0c6a9ad5b0e001d6087cbc220f1e"} Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.518424 4869 scope.go:117] "RemoveContainer" containerID="2697723ecc4b8b6205845e7410eff3ca8ae4c708e1ae9679c82243acb8b0acc3" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.517981 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.521250 4869 generic.go:334] "Generic (PLEG): container finished" podID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerID="d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548" exitCode=0 Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.521305 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e764959-1933-4a88-b8de-fd853d49a0d3","Type":"ContainerDied","Data":"d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548"} Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.521343 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3e764959-1933-4a88-b8de-fd853d49a0d3","Type":"ContainerDied","Data":"5f1fd2094bef98383942a91561a5cbbf3befcfbbb8d294e9a33ae0b2e565e142"} Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.521432 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.561725 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.569887 4869 scope.go:117] "RemoveContainer" containerID="4fcf47aa2c397bc1f0fa16216f4cc78821cbeb8188e49ef858d02fe2468e098b" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.578710 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.611697 4869 scope.go:117] "RemoveContainer" containerID="d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.619991 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.630083 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.638439 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: E0312 15:11:25.639125 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerName="rabbitmq" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.639199 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerName="rabbitmq" Mar 12 15:11:25 crc kubenswrapper[4869]: E0312 15:11:25.639260 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerName="setup-container" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.639320 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerName="setup-container" Mar 12 15:11:25 crc 
kubenswrapper[4869]: E0312 15:11:25.639373 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerName="rabbitmq" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.639421 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerName="rabbitmq" Mar 12 15:11:25 crc kubenswrapper[4869]: E0312 15:11:25.639491 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerName="setup-container" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.639553 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerName="setup-container" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.639788 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" containerName="rabbitmq" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.639862 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" containerName="rabbitmq" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.641268 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.644390 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.644459 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.644559 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.644563 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.644650 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xbp7k" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.644965 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.645472 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.647907 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.655349 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.663839 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.668222 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.671261 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.672079 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.673193 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.673892 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2cvmj" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.680164 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.681085 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.683334 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.705885 4869 scope.go:117] "RemoveContainer" containerID="fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.748205 4869 scope.go:117] "RemoveContainer" containerID="d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548" Mar 12 15:11:25 crc kubenswrapper[4869]: E0312 15:11:25.750100 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548\": container with ID starting with d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548 not found: ID does not exist" containerID="d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.750157 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548"} err="failed to get container status \"d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548\": rpc error: code = NotFound desc = could not find container \"d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548\": container with ID starting with d5177b769d612b137b3086add7c6a0909cc0fd2ea3b64c3ba4f6533ae01b1548 not found: ID does not exist" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.750189 4869 scope.go:117] "RemoveContainer" containerID="fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5" Mar 12 15:11:25 crc kubenswrapper[4869]: E0312 15:11:25.750568 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5\": container with ID starting with fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5 not found: ID does not exist" containerID="fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.750619 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5"} err="failed to get container status \"fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5\": rpc error: code = NotFound desc = could not find container \"fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5\": container with ID 
starting with fb475407f5163c190c98f95c63c48745bc8f6824ce9801f43a20edc1296046f5 not found: ID does not exist" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807050 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njd4\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-kube-api-access-5njd4\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807101 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807124 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807146 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f1182ade-04e6-4329-b41f-eda38443a859-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807617 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807701 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be06ec61-ee02-4ab3-8e42-d98f94af4a87-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807787 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be06ec61-ee02-4ab3-8e42-d98f94af4a87-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.807826 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f1182ade-04e6-4329-b41f-eda38443a859-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808036 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808086 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808114 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808233 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808264 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808303 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4scvf\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-kube-api-access-4scvf\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808422 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808459 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808495 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808528 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808712 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808780 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-config-data\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" 
Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808827 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.808868 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.910888 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.910948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.910968 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911009 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911043 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911077 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4scvf\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-kube-api-access-4scvf\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911144 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911175 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911198 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911226 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911451 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911479 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-config-data\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911514 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911565 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911595 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njd4\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-kube-api-access-5njd4\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911618 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911648 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911678 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f1182ade-04e6-4329-b41f-eda38443a859-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911748 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 
15:11:25.912068 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.911647 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.912332 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.912923 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-config-data\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.912948 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.913310 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.913979 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f1182ade-04e6-4329-b41f-eda38443a859-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.914194 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.914524 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.914731 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.915383 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.915560 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be06ec61-ee02-4ab3-8e42-d98f94af4a87-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.915637 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be06ec61-ee02-4ab3-8e42-d98f94af4a87-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.915665 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f1182ade-04e6-4329-b41f-eda38443a859-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.915812 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/be06ec61-ee02-4ab3-8e42-d98f94af4a87-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.916761 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 
15:11:25.916974 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.918028 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.918729 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f1182ade-04e6-4329-b41f-eda38443a859-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.922611 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/be06ec61-ee02-4ab3-8e42-d98f94af4a87-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.922913 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/be06ec61-ee02-4ab3-8e42-d98f94af4a87-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.923967 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f1182ade-04e6-4329-b41f-eda38443a859-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.927627 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.936450 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njd4\" (UniqueName: \"kubernetes.io/projected/f1182ade-04e6-4329-b41f-eda38443a859-kube-api-access-5njd4\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.938973 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4scvf\" (UniqueName: \"kubernetes.io/projected/be06ec61-ee02-4ab3-8e42-d98f94af4a87-kube-api-access-4scvf\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.957729 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"be06ec61-ee02-4ab3-8e42-d98f94af4a87\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.980783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"f1182ade-04e6-4329-b41f-eda38443a859\") " pod="openstack/rabbitmq-server-0" Mar 
12 15:11:25 crc kubenswrapper[4869]: I0312 15:11:25.981637 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:11:26 crc kubenswrapper[4869]: I0312 15:11:26.028626 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 15:11:26 crc kubenswrapper[4869]: I0312 15:11:26.355011 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e764959-1933-4a88-b8de-fd853d49a0d3" path="/var/lib/kubelet/pods/3e764959-1933-4a88-b8de-fd853d49a0d3/volumes" Mar 12 15:11:26 crc kubenswrapper[4869]: I0312 15:11:26.357214 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0323899-ea3b-4572-baa4-3483b0d5fd86" path="/var/lib/kubelet/pods/e0323899-ea3b-4572-baa4-3483b0d5fd86/volumes" Mar 12 15:11:26 crc kubenswrapper[4869]: I0312 15:11:26.473046 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 15:11:26 crc kubenswrapper[4869]: W0312 15:11:26.473611 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe06ec61_ee02_4ab3_8e42_d98f94af4a87.slice/crio-4ce5c95ee89d4c52c5f8e8534c1160204662b47fc8866789aed86fa1226040a6 WatchSource:0}: Error finding container 4ce5c95ee89d4c52c5f8e8534c1160204662b47fc8866789aed86fa1226040a6: Status 404 returned error can't find the container with id 4ce5c95ee89d4c52c5f8e8534c1160204662b47fc8866789aed86fa1226040a6 Mar 12 15:11:26 crc kubenswrapper[4869]: I0312 15:11:26.540217 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"be06ec61-ee02-4ab3-8e42-d98f94af4a87","Type":"ContainerStarted","Data":"4ce5c95ee89d4c52c5f8e8534c1160204662b47fc8866789aed86fa1226040a6"} Mar 12 15:11:26 crc kubenswrapper[4869]: I0312 15:11:26.566686 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-0"] Mar 12 15:11:26 crc kubenswrapper[4869]: W0312 15:11:26.571124 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1182ade_04e6_4329_b41f_eda38443a859.slice/crio-91734c4a0225c65719ad2493e32e552313cf3236433f8019189f56a187fa9944 WatchSource:0}: Error finding container 91734c4a0225c65719ad2493e32e552313cf3236433f8019189f56a187fa9944: Status 404 returned error can't find the container with id 91734c4a0225c65719ad2493e32e552313cf3236433f8019189f56a187fa9944 Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.153515 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-lcvmk"] Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.156346 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.157947 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.175483 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-lcvmk"] Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.344914 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-config\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.345077 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-svc\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " 
pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.345134 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-openstack-edpm-ipam\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.345155 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.345240 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.345268 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwj8l\" (UniqueName: \"kubernetes.io/projected/a6ae89fe-68c9-47df-b557-553e1237a1a1-kube-api-access-dwj8l\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.345418 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.447638 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-svc\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.447708 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-openstack-edpm-ipam\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.447729 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.447808 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.447829 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwj8l\" (UniqueName: \"kubernetes.io/projected/a6ae89fe-68c9-47df-b557-553e1237a1a1-kube-api-access-dwj8l\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: 
\"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.447894 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-swift-storage-0\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.447937 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-config\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.449594 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-swift-storage-0\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.449609 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-svc\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.449786 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" 
Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.450373 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.450416 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-openstack-edpm-ipam\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.450915 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-config\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.468638 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwj8l\" (UniqueName: \"kubernetes.io/projected/a6ae89fe-68c9-47df-b557-553e1237a1a1-kube-api-access-dwj8l\") pod \"dnsmasq-dns-5559d4f67f-lcvmk\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.472513 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:27 crc kubenswrapper[4869]: I0312 15:11:27.549560 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f1182ade-04e6-4329-b41f-eda38443a859","Type":"ContainerStarted","Data":"91734c4a0225c65719ad2493e32e552313cf3236433f8019189f56a187fa9944"} Mar 12 15:11:28 crc kubenswrapper[4869]: I0312 15:11:28.000270 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-lcvmk"] Mar 12 15:11:28 crc kubenswrapper[4869]: I0312 15:11:28.559077 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"be06ec61-ee02-4ab3-8e42-d98f94af4a87","Type":"ContainerStarted","Data":"3240b8f8ba437219c1fabe3d7871964a08b1ad28b4678f8b752914aaa360e26e"} Mar 12 15:11:28 crc kubenswrapper[4869]: I0312 15:11:28.561805 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f1182ade-04e6-4329-b41f-eda38443a859","Type":"ContainerStarted","Data":"b8e03577a6f18acc56c02e4f1efe3dc2a3c4b38a0667f1d0b0678e263e2e33a6"} Mar 12 15:11:28 crc kubenswrapper[4869]: I0312 15:11:28.563820 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerID="430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a" exitCode=0 Mar 12 15:11:28 crc kubenswrapper[4869]: I0312 15:11:28.563868 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" event={"ID":"a6ae89fe-68c9-47df-b557-553e1237a1a1","Type":"ContainerDied","Data":"430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a"} Mar 12 15:11:28 crc kubenswrapper[4869]: I0312 15:11:28.563904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" 
event={"ID":"a6ae89fe-68c9-47df-b557-553e1237a1a1","Type":"ContainerStarted","Data":"415d56e7a36777ff91b2aa75a470a2ff00f22d8c060b8b1813f2b7b6715180ae"} Mar 12 15:11:29 crc kubenswrapper[4869]: I0312 15:11:29.573583 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" event={"ID":"a6ae89fe-68c9-47df-b557-553e1237a1a1","Type":"ContainerStarted","Data":"efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c"} Mar 12 15:11:29 crc kubenswrapper[4869]: I0312 15:11:29.574193 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:29 crc kubenswrapper[4869]: I0312 15:11:29.600293 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" podStartSLOduration=2.600262197 podStartE2EDuration="2.600262197s" podCreationTimestamp="2026-03-12 15:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:11:29.590433587 +0000 UTC m=+1441.875658875" watchObservedRunningTime="2026-03-12 15:11:29.600262197 +0000 UTC m=+1441.885487505" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.474361 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.552906 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-pnwr2"] Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.553201 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" podUID="3301f303-5379-498d-978f-6606497ae3da" containerName="dnsmasq-dns" containerID="cri-o://f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7" gracePeriod=10 Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.688928 4869 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d99fc9df9-bblrn"] Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.693895 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.736181 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d99fc9df9-bblrn"] Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.757875 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-config\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.757929 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.757969 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.758003 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-dns-swift-storage-0\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: 
\"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.758027 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-dns-svc\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.758202 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.758253 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9t89\" (UniqueName: \"kubernetes.io/projected/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-kube-api-access-p9t89\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.860187 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.860244 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9t89\" (UniqueName: \"kubernetes.io/projected/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-kube-api-access-p9t89\") pod 
\"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.860369 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-config\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.860396 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.860418 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.860444 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-dns-svc\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.860463 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-dns-swift-storage-0\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: 
\"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.861605 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.864199 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-dns-swift-storage-0\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.865785 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-config\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.866227 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.867946 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" 
Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.869137 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-dns-svc\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:37 crc kubenswrapper[4869]: I0312 15:11:37.886717 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9t89\" (UniqueName: \"kubernetes.io/projected/052cf777-9cf7-44bf-bbc7-5cc89c1d5e22-kube-api-access-p9t89\") pod \"dnsmasq-dns-5d99fc9df9-bblrn\" (UID: \"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22\") " pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.053395 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.194496 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.272018 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-swift-storage-0\") pod \"3301f303-5379-498d-978f-6606497ae3da\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.272410 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-nb\") pod \"3301f303-5379-498d-978f-6606497ae3da\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.272438 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8flm5\" (UniqueName: \"kubernetes.io/projected/3301f303-5379-498d-978f-6606497ae3da-kube-api-access-8flm5\") pod \"3301f303-5379-498d-978f-6606497ae3da\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.272482 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-config\") pod \"3301f303-5379-498d-978f-6606497ae3da\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.272505 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-sb\") pod \"3301f303-5379-498d-978f-6606497ae3da\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.272593 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-svc\") pod \"3301f303-5379-498d-978f-6606497ae3da\" (UID: \"3301f303-5379-498d-978f-6606497ae3da\") " Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.278625 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3301f303-5379-498d-978f-6606497ae3da-kube-api-access-8flm5" (OuterVolumeSpecName: "kube-api-access-8flm5") pod "3301f303-5379-498d-978f-6606497ae3da" (UID: "3301f303-5379-498d-978f-6606497ae3da"). InnerVolumeSpecName "kube-api-access-8flm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.351792 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3301f303-5379-498d-978f-6606497ae3da" (UID: "3301f303-5379-498d-978f-6606497ae3da"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.366178 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3301f303-5379-498d-978f-6606497ae3da" (UID: "3301f303-5379-498d-978f-6606497ae3da"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.379908 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.380138 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.380235 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8flm5\" (UniqueName: \"kubernetes.io/projected/3301f303-5379-498d-978f-6606497ae3da-kube-api-access-8flm5\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.380054 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3301f303-5379-498d-978f-6606497ae3da" (UID: "3301f303-5379-498d-978f-6606497ae3da"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.382500 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3301f303-5379-498d-978f-6606497ae3da" (UID: "3301f303-5379-498d-978f-6606497ae3da"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.383526 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-config" (OuterVolumeSpecName: "config") pod "3301f303-5379-498d-978f-6606497ae3da" (UID: "3301f303-5379-498d-978f-6606497ae3da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.482833 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.482868 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.482882 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3301f303-5379-498d-978f-6606497ae3da-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.558157 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d99fc9df9-bblrn"] Mar 12 15:11:38 crc kubenswrapper[4869]: W0312 15:11:38.564183 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod052cf777_9cf7_44bf_bbc7_5cc89c1d5e22.slice/crio-5cfeba1d153196f9bf9a6350fbff680d0c78cdf495d8bf71cb44c8cb61caea33 WatchSource:0}: Error finding container 5cfeba1d153196f9bf9a6350fbff680d0c78cdf495d8bf71cb44c8cb61caea33: Status 404 returned error can't find the container with id 5cfeba1d153196f9bf9a6350fbff680d0c78cdf495d8bf71cb44c8cb61caea33 Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 
15:11:38.657230 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" event={"ID":"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22","Type":"ContainerStarted","Data":"5cfeba1d153196f9bf9a6350fbff680d0c78cdf495d8bf71cb44c8cb61caea33"} Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.659550 4869 generic.go:334] "Generic (PLEG): container finished" podID="3301f303-5379-498d-978f-6606497ae3da" containerID="f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7" exitCode=0 Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.659553 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.659578 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" event={"ID":"3301f303-5379-498d-978f-6606497ae3da","Type":"ContainerDied","Data":"f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7"} Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.659624 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-pnwr2" event={"ID":"3301f303-5379-498d-978f-6606497ae3da","Type":"ContainerDied","Data":"32ce90efaf2bf231f0cba42e67b6a82ac1fec04cdc64fc9ebe2f2a58c0b11cf5"} Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.659643 4869 scope.go:117] "RemoveContainer" containerID="f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.759892 4869 scope.go:117] "RemoveContainer" containerID="771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.796246 4869 scope.go:117] "RemoveContainer" containerID="f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7" Mar 12 15:11:38 crc kubenswrapper[4869]: E0312 15:11:38.796752 4869 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7\": container with ID starting with f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7 not found: ID does not exist" containerID="f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.796793 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7"} err="failed to get container status \"f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7\": rpc error: code = NotFound desc = could not find container \"f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7\": container with ID starting with f4f9a2c68cb581aa8b7a9969638e76f3e80eef2920a4f4f65cb47fbb88d7b0a7 not found: ID does not exist" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.796822 4869 scope.go:117] "RemoveContainer" containerID="771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.796881 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-pnwr2"] Mar 12 15:11:38 crc kubenswrapper[4869]: E0312 15:11:38.797999 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f\": container with ID starting with 771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f not found: ID does not exist" containerID="771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.798030 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f"} err="failed to get 
container status \"771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f\": rpc error: code = NotFound desc = could not find container \"771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f\": container with ID starting with 771b6a7e54188575e33684bc34c2ab8e4af795e3035c6d5f13139403a75da52f not found: ID does not exist" Mar 12 15:11:38 crc kubenswrapper[4869]: I0312 15:11:38.805288 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-pnwr2"] Mar 12 15:11:39 crc kubenswrapper[4869]: I0312 15:11:39.670473 4869 generic.go:334] "Generic (PLEG): container finished" podID="052cf777-9cf7-44bf-bbc7-5cc89c1d5e22" containerID="299a6d541dd8f80ea2c9bd7eeaa38b52710f73f9c9fdac6e2eb7e4f5eed23694" exitCode=0 Mar 12 15:11:39 crc kubenswrapper[4869]: I0312 15:11:39.670671 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" event={"ID":"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22","Type":"ContainerDied","Data":"299a6d541dd8f80ea2c9bd7eeaa38b52710f73f9c9fdac6e2eb7e4f5eed23694"} Mar 12 15:11:40 crc kubenswrapper[4869]: I0312 15:11:40.350732 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3301f303-5379-498d-978f-6606497ae3da" path="/var/lib/kubelet/pods/3301f303-5379-498d-978f-6606497ae3da/volumes" Mar 12 15:11:40 crc kubenswrapper[4869]: I0312 15:11:40.691485 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" event={"ID":"052cf777-9cf7-44bf-bbc7-5cc89c1d5e22","Type":"ContainerStarted","Data":"ff74e8857d9f5c518f2c44f165ebe203e14bcb4d2d19574817b869228b8a0397"} Mar 12 15:11:40 crc kubenswrapper[4869]: I0312 15:11:40.691857 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:40 crc kubenswrapper[4869]: I0312 15:11:40.713010 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" 
podStartSLOduration=3.712989945 podStartE2EDuration="3.712989945s" podCreationTimestamp="2026-03-12 15:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:11:40.709733502 +0000 UTC m=+1452.994958820" watchObservedRunningTime="2026-03-12 15:11:40.712989945 +0000 UTC m=+1452.998215233" Mar 12 15:11:41 crc kubenswrapper[4869]: I0312 15:11:41.795062 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.055429 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d99fc9df9-bblrn" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.111322 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-lcvmk"] Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.111654 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" podUID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerName="dnsmasq-dns" containerID="cri-o://efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c" gracePeriod=10 Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.680983 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.781010 4869 generic.go:334] "Generic (PLEG): container finished" podID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerID="efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c" exitCode=0 Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.781074 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" event={"ID":"a6ae89fe-68c9-47df-b557-553e1237a1a1","Type":"ContainerDied","Data":"efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c"} Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.781121 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" event={"ID":"a6ae89fe-68c9-47df-b557-553e1237a1a1","Type":"ContainerDied","Data":"415d56e7a36777ff91b2aa75a470a2ff00f22d8c060b8b1813f2b7b6715180ae"} Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.781147 4869 scope.go:117] "RemoveContainer" containerID="efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.781172 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-lcvmk" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808401 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-svc\") pod \"a6ae89fe-68c9-47df-b557-553e1237a1a1\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808454 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-swift-storage-0\") pod \"a6ae89fe-68c9-47df-b557-553e1237a1a1\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808477 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-config\") pod \"a6ae89fe-68c9-47df-b557-553e1237a1a1\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808500 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwj8l\" (UniqueName: \"kubernetes.io/projected/a6ae89fe-68c9-47df-b557-553e1237a1a1-kube-api-access-dwj8l\") pod \"a6ae89fe-68c9-47df-b557-553e1237a1a1\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808531 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-openstack-edpm-ipam\") pod \"a6ae89fe-68c9-47df-b557-553e1237a1a1\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808757 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-nb\") pod \"a6ae89fe-68c9-47df-b557-553e1237a1a1\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808860 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-sb\") pod \"a6ae89fe-68c9-47df-b557-553e1237a1a1\" (UID: \"a6ae89fe-68c9-47df-b557-553e1237a1a1\") " Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.808962 4869 scope.go:117] "RemoveContainer" containerID="430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.827570 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ae89fe-68c9-47df-b557-553e1237a1a1-kube-api-access-dwj8l" (OuterVolumeSpecName: "kube-api-access-dwj8l") pod "a6ae89fe-68c9-47df-b557-553e1237a1a1" (UID: "a6ae89fe-68c9-47df-b557-553e1237a1a1"). InnerVolumeSpecName "kube-api-access-dwj8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.837337 4869 scope.go:117] "RemoveContainer" containerID="efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c" Mar 12 15:11:48 crc kubenswrapper[4869]: E0312 15:11:48.839315 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c\": container with ID starting with efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c not found: ID does not exist" containerID="efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.839653 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c"} err="failed to get container status \"efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c\": rpc error: code = NotFound desc = could not find container \"efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c\": container with ID starting with efe16279a2ec4460df4970b43b5073e0024e15384512a125c7388432ffadc90c not found: ID does not exist" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.839776 4869 scope.go:117] "RemoveContainer" containerID="430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a" Mar 12 15:11:48 crc kubenswrapper[4869]: E0312 15:11:48.840496 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a\": container with ID starting with 430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a not found: ID does not exist" containerID="430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.840532 
4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a"} err="failed to get container status \"430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a\": rpc error: code = NotFound desc = could not find container \"430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a\": container with ID starting with 430cba1c7e7a89867929f28f200343cbf6660e96f8c45d48d98e4a43a899e90a not found: ID does not exist" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.888854 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6ae89fe-68c9-47df-b557-553e1237a1a1" (UID: "a6ae89fe-68c9-47df-b557-553e1237a1a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.890168 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6ae89fe-68c9-47df-b557-553e1237a1a1" (UID: "a6ae89fe-68c9-47df-b557-553e1237a1a1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.891711 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a6ae89fe-68c9-47df-b557-553e1237a1a1" (UID: "a6ae89fe-68c9-47df-b557-553e1237a1a1"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.902412 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-config" (OuterVolumeSpecName: "config") pod "a6ae89fe-68c9-47df-b557-553e1237a1a1" (UID: "a6ae89fe-68c9-47df-b557-553e1237a1a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.912017 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.912060 4869 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.912076 4869 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.912089 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwj8l\" (UniqueName: \"kubernetes.io/projected/a6ae89fe-68c9-47df-b557-553e1237a1a1-kube-api-access-dwj8l\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.912102 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.915422 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6ae89fe-68c9-47df-b557-553e1237a1a1" (UID: "a6ae89fe-68c9-47df-b557-553e1237a1a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:48 crc kubenswrapper[4869]: I0312 15:11:48.939700 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6ae89fe-68c9-47df-b557-553e1237a1a1" (UID: "a6ae89fe-68c9-47df-b557-553e1237a1a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:11:49 crc kubenswrapper[4869]: I0312 15:11:49.014319 4869 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:49 crc kubenswrapper[4869]: I0312 15:11:49.014354 4869 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ae89fe-68c9-47df-b557-553e1237a1a1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 15:11:49 crc kubenswrapper[4869]: I0312 15:11:49.121244 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-lcvmk"] Mar 12 15:11:49 crc kubenswrapper[4869]: I0312 15:11:49.133000 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-lcvmk"] Mar 12 15:11:49 crc kubenswrapper[4869]: I0312 15:11:49.684533 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:11:49 crc kubenswrapper[4869]: I0312 15:11:49.684606 4869 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:11:50 crc kubenswrapper[4869]: I0312 15:11:50.348401 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ae89fe-68c9-47df-b557-553e1237a1a1" path="/var/lib/kubelet/pods/a6ae89fe-68c9-47df-b557-553e1237a1a1/volumes" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.129911 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555472-6np76"] Mar 12 15:12:00 crc kubenswrapper[4869]: E0312 15:12:00.130807 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3301f303-5379-498d-978f-6606497ae3da" containerName="init" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.130822 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3301f303-5379-498d-978f-6606497ae3da" containerName="init" Mar 12 15:12:00 crc kubenswrapper[4869]: E0312 15:12:00.130843 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerName="init" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.130851 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerName="init" Mar 12 15:12:00 crc kubenswrapper[4869]: E0312 15:12:00.130866 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerName="dnsmasq-dns" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.130874 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerName="dnsmasq-dns" Mar 12 15:12:00 crc kubenswrapper[4869]: E0312 15:12:00.130890 4869 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3301f303-5379-498d-978f-6606497ae3da" containerName="dnsmasq-dns" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.130897 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="3301f303-5379-498d-978f-6606497ae3da" containerName="dnsmasq-dns" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.131086 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="3301f303-5379-498d-978f-6606497ae3da" containerName="dnsmasq-dns" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.131106 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ae89fe-68c9-47df-b557-553e1237a1a1" containerName="dnsmasq-dns" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.131761 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-6np76" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.135793 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.136062 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.138250 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.139907 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-6np76"] Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.236760 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq72f\" (UniqueName: \"kubernetes.io/projected/63ea2272-6e47-4761-9a89-c6c2d0768aee-kube-api-access-fq72f\") pod \"auto-csr-approver-29555472-6np76\" (UID: \"63ea2272-6e47-4761-9a89-c6c2d0768aee\") " 
pod="openshift-infra/auto-csr-approver-29555472-6np76" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.288152 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq"] Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.289856 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.292022 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.292238 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.292647 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.294706 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.313319 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq"] Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.338637 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq72f\" (UniqueName: \"kubernetes.io/projected/63ea2272-6e47-4761-9a89-c6c2d0768aee-kube-api-access-fq72f\") pod \"auto-csr-approver-29555472-6np76\" (UID: \"63ea2272-6e47-4761-9a89-c6c2d0768aee\") " pod="openshift-infra/auto-csr-approver-29555472-6np76" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.359251 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq72f\" (UniqueName: 
\"kubernetes.io/projected/63ea2272-6e47-4761-9a89-c6c2d0768aee-kube-api-access-fq72f\") pod \"auto-csr-approver-29555472-6np76\" (UID: \"63ea2272-6e47-4761-9a89-c6c2d0768aee\") " pod="openshift-infra/auto-csr-approver-29555472-6np76" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.442147 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.442219 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.442328 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.442380 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp28w\" (UniqueName: \"kubernetes.io/projected/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-kube-api-access-xp28w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: 
\"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.472976 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-6np76" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.544656 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.544745 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp28w\" (UniqueName: \"kubernetes.io/projected/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-kube-api-access-xp28w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.544830 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.544878 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: 
\"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.553006 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.553195 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.553344 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.561660 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp28w\" (UniqueName: \"kubernetes.io/projected/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-kube-api-access-xp28w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.613369 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.894506 4869 generic.go:334] "Generic (PLEG): container finished" podID="be06ec61-ee02-4ab3-8e42-d98f94af4a87" containerID="3240b8f8ba437219c1fabe3d7871964a08b1ad28b4678f8b752914aaa360e26e" exitCode=0 Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.894739 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"be06ec61-ee02-4ab3-8e42-d98f94af4a87","Type":"ContainerDied","Data":"3240b8f8ba437219c1fabe3d7871964a08b1ad28b4678f8b752914aaa360e26e"} Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.897934 4869 generic.go:334] "Generic (PLEG): container finished" podID="f1182ade-04e6-4329-b41f-eda38443a859" containerID="b8e03577a6f18acc56c02e4f1efe3dc2a3c4b38a0667f1d0b0678e263e2e33a6" exitCode=0 Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.897982 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f1182ade-04e6-4329-b41f-eda38443a859","Type":"ContainerDied","Data":"b8e03577a6f18acc56c02e4f1efe3dc2a3c4b38a0667f1d0b0678e263e2e33a6"} Mar 12 15:12:00 crc kubenswrapper[4869]: I0312 15:12:00.971805 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-6np76"] Mar 12 15:12:01 crc kubenswrapper[4869]: W0312 15:12:01.172822 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27517ee_4f44_4cfe_9b05_5cd9dd21165a.slice/crio-d23ffa151c7c86268eefb9ef76b3348f068c4edbd335a6fab8e948368bcdf452 WatchSource:0}: Error finding container d23ffa151c7c86268eefb9ef76b3348f068c4edbd335a6fab8e948368bcdf452: Status 404 returned error can't find the container with id d23ffa151c7c86268eefb9ef76b3348f068c4edbd335a6fab8e948368bcdf452 Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.173083 4869 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq"] Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.912741 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"be06ec61-ee02-4ab3-8e42-d98f94af4a87","Type":"ContainerStarted","Data":"ea05c195f7d12ef63ce5824737e71ab409b33996f47f3fbf9f17aaa46a2c56a8"} Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.913424 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.916280 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f1182ade-04e6-4329-b41f-eda38443a859","Type":"ContainerStarted","Data":"1227fa16d58748f3deaabf240f964062c67f82378818456030845e5e367fd2c1"} Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.916984 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.921212 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" event={"ID":"e27517ee-4f44-4cfe-9b05-5cd9dd21165a","Type":"ContainerStarted","Data":"d23ffa151c7c86268eefb9ef76b3348f068c4edbd335a6fab8e948368bcdf452"} Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.922882 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555472-6np76" event={"ID":"63ea2272-6e47-4761-9a89-c6c2d0768aee","Type":"ContainerStarted","Data":"a71823365ded53445ad050470f73174556e99d1e6b1c0156f2c8465220c2cfb1"} Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.959411 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.959387155 podStartE2EDuration="36.959387155s" 
podCreationTimestamp="2026-03-12 15:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:12:01.949067551 +0000 UTC m=+1474.234292839" watchObservedRunningTime="2026-03-12 15:12:01.959387155 +0000 UTC m=+1474.244612433" Mar 12 15:12:01 crc kubenswrapper[4869]: I0312 15:12:01.981455 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.981431163 podStartE2EDuration="36.981431163s" podCreationTimestamp="2026-03-12 15:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:12:01.976884363 +0000 UTC m=+1474.262109641" watchObservedRunningTime="2026-03-12 15:12:01.981431163 +0000 UTC m=+1474.266656441" Mar 12 15:12:02 crc kubenswrapper[4869]: I0312 15:12:02.935929 4869 generic.go:334] "Generic (PLEG): container finished" podID="63ea2272-6e47-4761-9a89-c6c2d0768aee" containerID="f50a5fe27635df7f9f7c005365851251f0111e896a624e12eefe7984ae07ddef" exitCode=0 Mar 12 15:12:02 crc kubenswrapper[4869]: I0312 15:12:02.936049 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555472-6np76" event={"ID":"63ea2272-6e47-4761-9a89-c6c2d0768aee","Type":"ContainerDied","Data":"f50a5fe27635df7f9f7c005365851251f0111e896a624e12eefe7984ae07ddef"} Mar 12 15:12:04 crc kubenswrapper[4869]: I0312 15:12:04.321224 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-6np76" Mar 12 15:12:04 crc kubenswrapper[4869]: I0312 15:12:04.430522 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq72f\" (UniqueName: \"kubernetes.io/projected/63ea2272-6e47-4761-9a89-c6c2d0768aee-kube-api-access-fq72f\") pod \"63ea2272-6e47-4761-9a89-c6c2d0768aee\" (UID: \"63ea2272-6e47-4761-9a89-c6c2d0768aee\") " Mar 12 15:12:04 crc kubenswrapper[4869]: I0312 15:12:04.437670 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ea2272-6e47-4761-9a89-c6c2d0768aee-kube-api-access-fq72f" (OuterVolumeSpecName: "kube-api-access-fq72f") pod "63ea2272-6e47-4761-9a89-c6c2d0768aee" (UID: "63ea2272-6e47-4761-9a89-c6c2d0768aee"). InnerVolumeSpecName "kube-api-access-fq72f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:12:04 crc kubenswrapper[4869]: I0312 15:12:04.534427 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq72f\" (UniqueName: \"kubernetes.io/projected/63ea2272-6e47-4761-9a89-c6c2d0768aee-kube-api-access-fq72f\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:04 crc kubenswrapper[4869]: I0312 15:12:04.963473 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555472-6np76" event={"ID":"63ea2272-6e47-4761-9a89-c6c2d0768aee","Type":"ContainerDied","Data":"a71823365ded53445ad050470f73174556e99d1e6b1c0156f2c8465220c2cfb1"} Mar 12 15:12:04 crc kubenswrapper[4869]: I0312 15:12:04.963514 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a71823365ded53445ad050470f73174556e99d1e6b1c0156f2c8465220c2cfb1" Mar 12 15:12:04 crc kubenswrapper[4869]: I0312 15:12:04.963579 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-6np76" Mar 12 15:12:05 crc kubenswrapper[4869]: I0312 15:12:05.392728 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-j7fkr"] Mar 12 15:12:05 crc kubenswrapper[4869]: I0312 15:12:05.403514 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-j7fkr"] Mar 12 15:12:06 crc kubenswrapper[4869]: I0312 15:12:06.348808 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b691259-763f-4535-98af-ef6fa62d8a0d" path="/var/lib/kubelet/pods/1b691259-763f-4535-98af-ef6fa62d8a0d/volumes" Mar 12 15:12:11 crc kubenswrapper[4869]: I0312 15:12:11.054248 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" event={"ID":"e27517ee-4f44-4cfe-9b05-5cd9dd21165a","Type":"ContainerStarted","Data":"20215b8791960d963d77de1a92df61013c565f4b2c32cbbfcf3b7b583ae6eb32"} Mar 12 15:12:11 crc kubenswrapper[4869]: I0312 15:12:11.082167 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" podStartSLOduration=1.6320834469999999 podStartE2EDuration="11.082147153s" podCreationTimestamp="2026-03-12 15:12:00 +0000 UTC" firstStartedPulling="2026-03-12 15:12:01.176564216 +0000 UTC m=+1473.461789494" lastFinishedPulling="2026-03-12 15:12:10.626627922 +0000 UTC m=+1482.911853200" observedRunningTime="2026-03-12 15:12:11.077440269 +0000 UTC m=+1483.362665547" watchObservedRunningTime="2026-03-12 15:12:11.082147153 +0000 UTC m=+1483.367372431" Mar 12 15:12:15 crc kubenswrapper[4869]: I0312 15:12:15.984838 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 15:12:16 crc kubenswrapper[4869]: I0312 15:12:16.032805 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Mar 12 15:12:19 crc kubenswrapper[4869]: I0312 15:12:19.684147 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:12:19 crc kubenswrapper[4869]: I0312 15:12:19.684754 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:12:19 crc kubenswrapper[4869]: I0312 15:12:19.684811 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:12:19 crc kubenswrapper[4869]: I0312 15:12:19.685876 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e494c12edd1eec941b7037231b079cb054af121efa06d3e86355c27776905fd6"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:12:19 crc kubenswrapper[4869]: I0312 15:12:19.685976 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://e494c12edd1eec941b7037231b079cb054af121efa06d3e86355c27776905fd6" gracePeriod=600 Mar 12 15:12:20 crc kubenswrapper[4869]: I0312 15:12:20.143097 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" 
containerID="e494c12edd1eec941b7037231b079cb054af121efa06d3e86355c27776905fd6" exitCode=0 Mar 12 15:12:20 crc kubenswrapper[4869]: I0312 15:12:20.143178 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"e494c12edd1eec941b7037231b079cb054af121efa06d3e86355c27776905fd6"} Mar 12 15:12:20 crc kubenswrapper[4869]: I0312 15:12:20.143462 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb"} Mar 12 15:12:20 crc kubenswrapper[4869]: I0312 15:12:20.143484 4869 scope.go:117] "RemoveContainer" containerID="2f5452570a2d00afc7e7591fc67b3884055af23168ca0e1d9b4ff0e5dcdc6950" Mar 12 15:12:21 crc kubenswrapper[4869]: I0312 15:12:21.153620 4869 generic.go:334] "Generic (PLEG): container finished" podID="e27517ee-4f44-4cfe-9b05-5cd9dd21165a" containerID="20215b8791960d963d77de1a92df61013c565f4b2c32cbbfcf3b7b583ae6eb32" exitCode=0 Mar 12 15:12:21 crc kubenswrapper[4869]: I0312 15:12:21.153717 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" event={"ID":"e27517ee-4f44-4cfe-9b05-5cd9dd21165a","Type":"ContainerDied","Data":"20215b8791960d963d77de1a92df61013c565f4b2c32cbbfcf3b7b583ae6eb32"} Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.620223 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.703640 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-ssh-key-openstack-edpm-ipam\") pod \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.703699 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-repo-setup-combined-ca-bundle\") pod \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.703753 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-inventory\") pod \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.703788 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp28w\" (UniqueName: \"kubernetes.io/projected/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-kube-api-access-xp28w\") pod \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\" (UID: \"e27517ee-4f44-4cfe-9b05-5cd9dd21165a\") " Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.709474 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e27517ee-4f44-4cfe-9b05-5cd9dd21165a" (UID: "e27517ee-4f44-4cfe-9b05-5cd9dd21165a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.713046 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-kube-api-access-xp28w" (OuterVolumeSpecName: "kube-api-access-xp28w") pod "e27517ee-4f44-4cfe-9b05-5cd9dd21165a" (UID: "e27517ee-4f44-4cfe-9b05-5cd9dd21165a"). InnerVolumeSpecName "kube-api-access-xp28w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.732903 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-inventory" (OuterVolumeSpecName: "inventory") pod "e27517ee-4f44-4cfe-9b05-5cd9dd21165a" (UID: "e27517ee-4f44-4cfe-9b05-5cd9dd21165a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.739079 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e27517ee-4f44-4cfe-9b05-5cd9dd21165a" (UID: "e27517ee-4f44-4cfe-9b05-5cd9dd21165a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.806375 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.806416 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp28w\" (UniqueName: \"kubernetes.io/projected/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-kube-api-access-xp28w\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.806434 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:22 crc kubenswrapper[4869]: I0312 15:12:22.806447 4869 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27517ee-4f44-4cfe-9b05-5cd9dd21165a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.175383 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" event={"ID":"e27517ee-4f44-4cfe-9b05-5cd9dd21165a","Type":"ContainerDied","Data":"d23ffa151c7c86268eefb9ef76b3348f068c4edbd335a6fab8e948368bcdf452"} Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.175765 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23ffa151c7c86268eefb9ef76b3348f068c4edbd335a6fab8e948368bcdf452" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.175445 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.257581 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl"] Mar 12 15:12:23 crc kubenswrapper[4869]: E0312 15:12:23.258360 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ea2272-6e47-4761-9a89-c6c2d0768aee" containerName="oc" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.258444 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ea2272-6e47-4761-9a89-c6c2d0768aee" containerName="oc" Mar 12 15:12:23 crc kubenswrapper[4869]: E0312 15:12:23.258562 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27517ee-4f44-4cfe-9b05-5cd9dd21165a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.258627 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27517ee-4f44-4cfe-9b05-5cd9dd21165a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.258907 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27517ee-4f44-4cfe-9b05-5cd9dd21165a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.259014 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ea2272-6e47-4761-9a89-c6c2d0768aee" containerName="oc" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.259988 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.262759 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.262952 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.263110 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.263258 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.267531 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl"] Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.423487 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.423600 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wns28\" (UniqueName: \"kubernetes.io/projected/230d1ac1-539b-4bb6-9560-ce764540f933-kube-api-access-wns28\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.423770 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.525115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wns28\" (UniqueName: \"kubernetes.io/projected/230d1ac1-539b-4bb6-9560-ce764540f933-kube-api-access-wns28\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.525296 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.525384 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.531025 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.532240 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.542132 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wns28\" (UniqueName: \"kubernetes.io/projected/230d1ac1-539b-4bb6-9560-ce764540f933-kube-api-access-wns28\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-x6ckl\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:23 crc kubenswrapper[4869]: I0312 15:12:23.630901 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:24 crc kubenswrapper[4869]: I0312 15:12:24.164742 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl"] Mar 12 15:12:24 crc kubenswrapper[4869]: I0312 15:12:24.186481 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" event={"ID":"230d1ac1-539b-4bb6-9560-ce764540f933","Type":"ContainerStarted","Data":"fedb83d78bfd27cec77c5d2a0913d67cca70d495ea2ca513ff33832f77cd22bd"} Mar 12 15:12:25 crc kubenswrapper[4869]: I0312 15:12:25.198376 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" event={"ID":"230d1ac1-539b-4bb6-9560-ce764540f933","Type":"ContainerStarted","Data":"c25a45c2e8493857498f2354a76f2a7b2036e622c2538e3cf705f006b181693c"} Mar 12 15:12:25 crc kubenswrapper[4869]: I0312 15:12:25.224031 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" podStartSLOduration=1.809938909 podStartE2EDuration="2.224004268s" podCreationTimestamp="2026-03-12 15:12:23 +0000 UTC" firstStartedPulling="2026-03-12 15:12:24.174312983 +0000 UTC m=+1496.459538261" lastFinishedPulling="2026-03-12 15:12:24.588378342 +0000 UTC m=+1496.873603620" observedRunningTime="2026-03-12 15:12:25.21597046 +0000 UTC m=+1497.501195738" watchObservedRunningTime="2026-03-12 15:12:25.224004268 +0000 UTC m=+1497.509229556" Mar 12 15:12:27 crc kubenswrapper[4869]: I0312 15:12:27.216193 4869 generic.go:334] "Generic (PLEG): container finished" podID="230d1ac1-539b-4bb6-9560-ce764540f933" containerID="c25a45c2e8493857498f2354a76f2a7b2036e622c2538e3cf705f006b181693c" exitCode=0 Mar 12 15:12:27 crc kubenswrapper[4869]: I0312 15:12:27.216296 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" event={"ID":"230d1ac1-539b-4bb6-9560-ce764540f933","Type":"ContainerDied","Data":"c25a45c2e8493857498f2354a76f2a7b2036e622c2538e3cf705f006b181693c"} Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.697425 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.824672 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-ssh-key-openstack-edpm-ipam\") pod \"230d1ac1-539b-4bb6-9560-ce764540f933\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.824762 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wns28\" (UniqueName: \"kubernetes.io/projected/230d1ac1-539b-4bb6-9560-ce764540f933-kube-api-access-wns28\") pod \"230d1ac1-539b-4bb6-9560-ce764540f933\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.824879 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-inventory\") pod \"230d1ac1-539b-4bb6-9560-ce764540f933\" (UID: \"230d1ac1-539b-4bb6-9560-ce764540f933\") " Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.831917 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230d1ac1-539b-4bb6-9560-ce764540f933-kube-api-access-wns28" (OuterVolumeSpecName: "kube-api-access-wns28") pod "230d1ac1-539b-4bb6-9560-ce764540f933" (UID: "230d1ac1-539b-4bb6-9560-ce764540f933"). InnerVolumeSpecName "kube-api-access-wns28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.857512 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "230d1ac1-539b-4bb6-9560-ce764540f933" (UID: "230d1ac1-539b-4bb6-9560-ce764540f933"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.860011 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-inventory" (OuterVolumeSpecName: "inventory") pod "230d1ac1-539b-4bb6-9560-ce764540f933" (UID: "230d1ac1-539b-4bb6-9560-ce764540f933"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.927262 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.927298 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wns28\" (UniqueName: \"kubernetes.io/projected/230d1ac1-539b-4bb6-9560-ce764540f933-kube-api-access-wns28\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:28 crc kubenswrapper[4869]: I0312 15:12:28.927310 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/230d1ac1-539b-4bb6-9560-ce764540f933-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.234980 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" 
event={"ID":"230d1ac1-539b-4bb6-9560-ce764540f933","Type":"ContainerDied","Data":"fedb83d78bfd27cec77c5d2a0913d67cca70d495ea2ca513ff33832f77cd22bd"} Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.235050 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fedb83d78bfd27cec77c5d2a0913d67cca70d495ea2ca513ff33832f77cd22bd" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.235049 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-x6ckl" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.816958 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q"] Mar 12 15:12:29 crc kubenswrapper[4869]: E0312 15:12:29.817682 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230d1ac1-539b-4bb6-9560-ce764540f933" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.817698 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="230d1ac1-539b-4bb6-9560-ce764540f933" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.817890 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="230d1ac1-539b-4bb6-9560-ce764540f933" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.818570 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.820632 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.823802 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.825588 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.826700 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.832899 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q"] Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.952328 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.952401 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflll\" (UniqueName: \"kubernetes.io/projected/300f0918-acb3-42ed-a67a-560608d31eda-kube-api-access-jflll\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 
15:12:29.952475 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:29 crc kubenswrapper[4869]: I0312 15:12:29.952525 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.053886 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jflll\" (UniqueName: \"kubernetes.io/projected/300f0918-acb3-42ed-a67a-560608d31eda-kube-api-access-jflll\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.053996 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.054059 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.054196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.059758 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.060012 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.064422 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.077597 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflll\" (UniqueName: \"kubernetes.io/projected/300f0918-acb3-42ed-a67a-560608d31eda-kube-api-access-jflll\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.152663 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:12:30 crc kubenswrapper[4869]: I0312 15:12:30.672504 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q"] Mar 12 15:12:30 crc kubenswrapper[4869]: W0312 15:12:30.675691 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod300f0918_acb3_42ed_a67a_560608d31eda.slice/crio-726b79b3b5011a74d7744011fd11634a1c2aff7cc3df1e1b0048657fa70293da WatchSource:0}: Error finding container 726b79b3b5011a74d7744011fd11634a1c2aff7cc3df1e1b0048657fa70293da: Status 404 returned error can't find the container with id 726b79b3b5011a74d7744011fd11634a1c2aff7cc3df1e1b0048657fa70293da Mar 12 15:12:31 crc kubenswrapper[4869]: I0312 15:12:31.217415 4869 scope.go:117] "RemoveContainer" containerID="6b711caada6fcc2094c40c2ad7cddf69eeefb7243782e53a79bc55dfcb83f8c6" Mar 12 15:12:31 crc kubenswrapper[4869]: I0312 15:12:31.272934 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" event={"ID":"300f0918-acb3-42ed-a67a-560608d31eda","Type":"ContainerStarted","Data":"726b79b3b5011a74d7744011fd11634a1c2aff7cc3df1e1b0048657fa70293da"} Mar 12 15:12:32 crc 
kubenswrapper[4869]: I0312 15:12:32.303147 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" event={"ID":"300f0918-acb3-42ed-a67a-560608d31eda","Type":"ContainerStarted","Data":"0c6b388172e5ba939e3f8b3ecd2a84648bffb1d3009f4b56cd39dcf9d71f4f84"} Mar 12 15:12:32 crc kubenswrapper[4869]: I0312 15:12:32.331190 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" podStartSLOduration=2.863257919 podStartE2EDuration="3.331173774s" podCreationTimestamp="2026-03-12 15:12:29 +0000 UTC" firstStartedPulling="2026-03-12 15:12:30.677843339 +0000 UTC m=+1502.963068617" lastFinishedPulling="2026-03-12 15:12:31.145759194 +0000 UTC m=+1503.430984472" observedRunningTime="2026-03-12 15:12:32.324296979 +0000 UTC m=+1504.609522257" watchObservedRunningTime="2026-03-12 15:12:32.331173774 +0000 UTC m=+1504.616399042" Mar 12 15:13:31 crc kubenswrapper[4869]: I0312 15:13:31.388102 4869 scope.go:117] "RemoveContainer" containerID="7f4f5d1dfdf83a25a682596cf7fa1dab03645541b1862af39fad01aa7f207ea7" Mar 12 15:13:31 crc kubenswrapper[4869]: I0312 15:13:31.448911 4869 scope.go:117] "RemoveContainer" containerID="c707e908255bc7787c8e32007a043fb2f818792e04634519ab5b21f0e75e694d" Mar 12 15:13:31 crc kubenswrapper[4869]: I0312 15:13:31.505602 4869 scope.go:117] "RemoveContainer" containerID="85f0ae6aebb26417371e54af05957b6d246bf047dc19680a33d5123e212433be" Mar 12 15:13:31 crc kubenswrapper[4869]: I0312 15:13:31.544232 4869 scope.go:117] "RemoveContainer" containerID="9afe1fe8028fdeb1af82340d7026017ffb74f59432a4797d0a4773a4fa812c5a" Mar 12 15:13:31 crc kubenswrapper[4869]: I0312 15:13:31.596261 4869 scope.go:117] "RemoveContainer" containerID="03e8197617894e351966949eef740a763d3ad9420224bf1ce16c77c5602dbf98" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.142636 4869 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29555474-6lm5p"] Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.144570 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-6lm5p" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.146839 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.146854 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.147769 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.151627 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-6lm5p"] Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.286114 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wff\" (UniqueName: \"kubernetes.io/projected/dae66448-0cc3-4feb-b232-387110fba260-kube-api-access-c5wff\") pod \"auto-csr-approver-29555474-6lm5p\" (UID: \"dae66448-0cc3-4feb-b232-387110fba260\") " pod="openshift-infra/auto-csr-approver-29555474-6lm5p" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.387632 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wff\" (UniqueName: \"kubernetes.io/projected/dae66448-0cc3-4feb-b232-387110fba260-kube-api-access-c5wff\") pod \"auto-csr-approver-29555474-6lm5p\" (UID: \"dae66448-0cc3-4feb-b232-387110fba260\") " pod="openshift-infra/auto-csr-approver-29555474-6lm5p" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.417592 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wff\" (UniqueName: 
\"kubernetes.io/projected/dae66448-0cc3-4feb-b232-387110fba260-kube-api-access-c5wff\") pod \"auto-csr-approver-29555474-6lm5p\" (UID: \"dae66448-0cc3-4feb-b232-387110fba260\") " pod="openshift-infra/auto-csr-approver-29555474-6lm5p" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.465769 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-6lm5p" Mar 12 15:14:00 crc kubenswrapper[4869]: I0312 15:14:00.959191 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-6lm5p"] Mar 12 15:14:01 crc kubenswrapper[4869]: I0312 15:14:01.155920 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555474-6lm5p" event={"ID":"dae66448-0cc3-4feb-b232-387110fba260","Type":"ContainerStarted","Data":"139550d0d4900cca94650393fb2417a19444b64eec239d2c4af4d0608606f0c7"} Mar 12 15:14:03 crc kubenswrapper[4869]: I0312 15:14:03.179950 4869 generic.go:334] "Generic (PLEG): container finished" podID="dae66448-0cc3-4feb-b232-387110fba260" containerID="d3242ae1f0643ab17059ec452626a3afde6a26864cee76aec8d75d9e20fb05f9" exitCode=0 Mar 12 15:14:03 crc kubenswrapper[4869]: I0312 15:14:03.180069 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555474-6lm5p" event={"ID":"dae66448-0cc3-4feb-b232-387110fba260","Type":"ContainerDied","Data":"d3242ae1f0643ab17059ec452626a3afde6a26864cee76aec8d75d9e20fb05f9"} Mar 12 15:14:04 crc kubenswrapper[4869]: I0312 15:14:04.541099 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-6lm5p" Mar 12 15:14:04 crc kubenswrapper[4869]: I0312 15:14:04.581723 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5wff\" (UniqueName: \"kubernetes.io/projected/dae66448-0cc3-4feb-b232-387110fba260-kube-api-access-c5wff\") pod \"dae66448-0cc3-4feb-b232-387110fba260\" (UID: \"dae66448-0cc3-4feb-b232-387110fba260\") " Mar 12 15:14:04 crc kubenswrapper[4869]: I0312 15:14:04.620575 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae66448-0cc3-4feb-b232-387110fba260-kube-api-access-c5wff" (OuterVolumeSpecName: "kube-api-access-c5wff") pod "dae66448-0cc3-4feb-b232-387110fba260" (UID: "dae66448-0cc3-4feb-b232-387110fba260"). InnerVolumeSpecName "kube-api-access-c5wff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:14:04 crc kubenswrapper[4869]: I0312 15:14:04.684126 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5wff\" (UniqueName: \"kubernetes.io/projected/dae66448-0cc3-4feb-b232-387110fba260-kube-api-access-c5wff\") on node \"crc\" DevicePath \"\"" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.105253 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4nvms"] Mar 12 15:14:05 crc kubenswrapper[4869]: E0312 15:14:05.106221 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae66448-0cc3-4feb-b232-387110fba260" containerName="oc" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.106238 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae66448-0cc3-4feb-b232-387110fba260" containerName="oc" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.106417 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae66448-0cc3-4feb-b232-387110fba260" containerName="oc" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.107854 4869 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.117358 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4nvms"] Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.194665 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-utilities\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.194788 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmbvd\" (UniqueName: \"kubernetes.io/projected/fb94af03-0501-4e6c-b71e-d1600c87add4-kube-api-access-kmbvd\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.194844 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-catalog-content\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.202973 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555474-6lm5p" event={"ID":"dae66448-0cc3-4feb-b232-387110fba260","Type":"ContainerDied","Data":"139550d0d4900cca94650393fb2417a19444b64eec239d2c4af4d0608606f0c7"} Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.203020 4869 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="139550d0d4900cca94650393fb2417a19444b64eec239d2c4af4d0608606f0c7" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.203067 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-6lm5p" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.295636 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmbvd\" (UniqueName: \"kubernetes.io/projected/fb94af03-0501-4e6c-b71e-d1600c87add4-kube-api-access-kmbvd\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.295938 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-catalog-content\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.296061 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-utilities\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.296649 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-utilities\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.296869 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-catalog-content\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.314257 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmbvd\" (UniqueName: \"kubernetes.io/projected/fb94af03-0501-4e6c-b71e-d1600c87add4-kube-api-access-kmbvd\") pod \"community-operators-4nvms\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.438123 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.646131 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-sjc6s"] Mar 12 15:14:05 crc kubenswrapper[4869]: I0312 15:14:05.660530 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-sjc6s"] Mar 12 15:14:06 crc kubenswrapper[4869]: I0312 15:14:06.049448 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4nvms"] Mar 12 15:14:06 crc kubenswrapper[4869]: I0312 15:14:06.212319 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nvms" event={"ID":"fb94af03-0501-4e6c-b71e-d1600c87add4","Type":"ContainerStarted","Data":"b93250e198ae2b3e4bf8a26f7360049c9bc6f1641ebbbbed6ef9a5be3e75fec1"} Mar 12 15:14:06 crc kubenswrapper[4869]: I0312 15:14:06.346990 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762c4ca7-73e4-4585-be50-f2308260ce36" path="/var/lib/kubelet/pods/762c4ca7-73e4-4585-be50-f2308260ce36/volumes" Mar 12 15:14:07 crc kubenswrapper[4869]: I0312 
15:14:07.222873 4869 generic.go:334] "Generic (PLEG): container finished" podID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerID="94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895" exitCode=0 Mar 12 15:14:07 crc kubenswrapper[4869]: I0312 15:14:07.223061 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nvms" event={"ID":"fb94af03-0501-4e6c-b71e-d1600c87add4","Type":"ContainerDied","Data":"94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895"} Mar 12 15:14:09 crc kubenswrapper[4869]: I0312 15:14:09.247845 4869 generic.go:334] "Generic (PLEG): container finished" podID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerID="016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3" exitCode=0 Mar 12 15:14:09 crc kubenswrapper[4869]: I0312 15:14:09.247906 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nvms" event={"ID":"fb94af03-0501-4e6c-b71e-d1600c87add4","Type":"ContainerDied","Data":"016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3"} Mar 12 15:14:10 crc kubenswrapper[4869]: I0312 15:14:10.261708 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nvms" event={"ID":"fb94af03-0501-4e6c-b71e-d1600c87add4","Type":"ContainerStarted","Data":"48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6"} Mar 12 15:14:10 crc kubenswrapper[4869]: I0312 15:14:10.291300 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4nvms" podStartSLOduration=2.841627823 podStartE2EDuration="5.291277776s" podCreationTimestamp="2026-03-12 15:14:05 +0000 UTC" firstStartedPulling="2026-03-12 15:14:07.224954315 +0000 UTC m=+1599.510179583" lastFinishedPulling="2026-03-12 15:14:09.674604268 +0000 UTC m=+1601.959829536" observedRunningTime="2026-03-12 15:14:10.282420636 +0000 UTC m=+1602.567645934" 
watchObservedRunningTime="2026-03-12 15:14:10.291277776 +0000 UTC m=+1602.576503064" Mar 12 15:14:15 crc kubenswrapper[4869]: I0312 15:14:15.439228 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:15 crc kubenswrapper[4869]: I0312 15:14:15.439828 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:15 crc kubenswrapper[4869]: I0312 15:14:15.489870 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:16 crc kubenswrapper[4869]: I0312 15:14:16.358832 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:16 crc kubenswrapper[4869]: I0312 15:14:16.408414 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4nvms"] Mar 12 15:14:18 crc kubenswrapper[4869]: I0312 15:14:18.330318 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4nvms" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerName="registry-server" containerID="cri-o://48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6" gracePeriod=2 Mar 12 15:14:18 crc kubenswrapper[4869]: I0312 15:14:18.779915 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:18 crc kubenswrapper[4869]: I0312 15:14:18.967372 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmbvd\" (UniqueName: \"kubernetes.io/projected/fb94af03-0501-4e6c-b71e-d1600c87add4-kube-api-access-kmbvd\") pod \"fb94af03-0501-4e6c-b71e-d1600c87add4\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " Mar 12 15:14:18 crc kubenswrapper[4869]: I0312 15:14:18.967508 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-utilities\") pod \"fb94af03-0501-4e6c-b71e-d1600c87add4\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " Mar 12 15:14:18 crc kubenswrapper[4869]: I0312 15:14:18.967591 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-catalog-content\") pod \"fb94af03-0501-4e6c-b71e-d1600c87add4\" (UID: \"fb94af03-0501-4e6c-b71e-d1600c87add4\") " Mar 12 15:14:18 crc kubenswrapper[4869]: I0312 15:14:18.968524 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-utilities" (OuterVolumeSpecName: "utilities") pod "fb94af03-0501-4e6c-b71e-d1600c87add4" (UID: "fb94af03-0501-4e6c-b71e-d1600c87add4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:14:18 crc kubenswrapper[4869]: I0312 15:14:18.973904 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb94af03-0501-4e6c-b71e-d1600c87add4-kube-api-access-kmbvd" (OuterVolumeSpecName: "kube-api-access-kmbvd") pod "fb94af03-0501-4e6c-b71e-d1600c87add4" (UID: "fb94af03-0501-4e6c-b71e-d1600c87add4"). InnerVolumeSpecName "kube-api-access-kmbvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.026922 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb94af03-0501-4e6c-b71e-d1600c87add4" (UID: "fb94af03-0501-4e6c-b71e-d1600c87add4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.070424 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmbvd\" (UniqueName: \"kubernetes.io/projected/fb94af03-0501-4e6c-b71e-d1600c87add4-kube-api-access-kmbvd\") on node \"crc\" DevicePath \"\"" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.070458 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.070467 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb94af03-0501-4e6c-b71e-d1600c87add4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.345898 4869 generic.go:334] "Generic (PLEG): container finished" podID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerID="48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6" exitCode=0 Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.345963 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nvms" event={"ID":"fb94af03-0501-4e6c-b71e-d1600c87add4","Type":"ContainerDied","Data":"48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6"} Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.346010 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4nvms" event={"ID":"fb94af03-0501-4e6c-b71e-d1600c87add4","Type":"ContainerDied","Data":"b93250e198ae2b3e4bf8a26f7360049c9bc6f1641ebbbbed6ef9a5be3e75fec1"} Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.346033 4869 scope.go:117] "RemoveContainer" containerID="48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.346053 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4nvms" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.388349 4869 scope.go:117] "RemoveContainer" containerID="016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.397650 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4nvms"] Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.409283 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4nvms"] Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.415322 4869 scope.go:117] "RemoveContainer" containerID="94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.462017 4869 scope.go:117] "RemoveContainer" containerID="48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6" Mar 12 15:14:19 crc kubenswrapper[4869]: E0312 15:14:19.462510 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6\": container with ID starting with 48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6 not found: ID does not exist" containerID="48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 
15:14:19.462565 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6"} err="failed to get container status \"48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6\": rpc error: code = NotFound desc = could not find container \"48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6\": container with ID starting with 48014c430c136da0e3c1ff64c07753e54b7579401f604111899c4faf62033ab6 not found: ID does not exist" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.462595 4869 scope.go:117] "RemoveContainer" containerID="016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3" Mar 12 15:14:19 crc kubenswrapper[4869]: E0312 15:14:19.462961 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3\": container with ID starting with 016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3 not found: ID does not exist" containerID="016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.463014 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3"} err="failed to get container status \"016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3\": rpc error: code = NotFound desc = could not find container \"016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3\": container with ID starting with 016d49d2ac1843fe817de6c7149e9ab5befbe6faade9b1f3b5d069a6b9a1bfb3 not found: ID does not exist" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.463077 4869 scope.go:117] "RemoveContainer" containerID="94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895" Mar 12 15:14:19 crc 
kubenswrapper[4869]: E0312 15:14:19.463414 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895\": container with ID starting with 94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895 not found: ID does not exist" containerID="94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895" Mar 12 15:14:19 crc kubenswrapper[4869]: I0312 15:14:19.463469 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895"} err="failed to get container status \"94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895\": rpc error: code = NotFound desc = could not find container \"94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895\": container with ID starting with 94535d1d7da6841d49f5258c5db1503bffc3ea0ce9b6c1992d5fd3796d2f5895 not found: ID does not exist" Mar 12 15:14:20 crc kubenswrapper[4869]: I0312 15:14:20.354904 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" path="/var/lib/kubelet/pods/fb94af03-0501-4e6c-b71e-d1600c87add4/volumes" Mar 12 15:14:31 crc kubenswrapper[4869]: I0312 15:14:31.760898 4869 scope.go:117] "RemoveContainer" containerID="8ee38970a43dd928d743c1150be98a5af45f8d202531e2c2a85553d9c175dd60" Mar 12 15:14:31 crc kubenswrapper[4869]: I0312 15:14:31.811709 4869 scope.go:117] "RemoveContainer" containerID="68f8228c41c7a1e89b5fa127af5baed49ccb31777ec001ffb327de077e00fa70" Mar 12 15:14:31 crc kubenswrapper[4869]: I0312 15:14:31.837909 4869 scope.go:117] "RemoveContainer" containerID="4329c2f75308269fa25f20e3a946f5880a52e98019bae0743d891f3e785ef8ff" Mar 12 15:14:31 crc kubenswrapper[4869]: I0312 15:14:31.884656 4869 scope.go:117] "RemoveContainer" 
containerID="bbeb4784032f18091ab3ebb3557765d719c3cdb93ad1cb66ba79c60042c1a016" Mar 12 15:14:49 crc kubenswrapper[4869]: I0312 15:14:49.683738 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:14:49 crc kubenswrapper[4869]: I0312 15:14:49.684230 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.140376 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj"] Mar 12 15:15:00 crc kubenswrapper[4869]: E0312 15:15:00.141342 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerName="extract-content" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.141361 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerName="extract-content" Mar 12 15:15:00 crc kubenswrapper[4869]: E0312 15:15:00.141375 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerName="extract-utilities" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.141382 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerName="extract-utilities" Mar 12 15:15:00 crc kubenswrapper[4869]: E0312 15:15:00.141393 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" 
containerName="registry-server" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.141401 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerName="registry-server" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.141648 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb94af03-0501-4e6c-b71e-d1600c87add4" containerName="registry-server" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.142366 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.144672 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.144905 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.155481 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj"] Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.283966 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f762edf1-d60b-44cf-a949-f8a8d53599c3-secret-volume\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.284049 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p58d5\" (UniqueName: \"kubernetes.io/projected/f762edf1-d60b-44cf-a949-f8a8d53599c3-kube-api-access-p58d5\") pod 
\"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.284132 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f762edf1-d60b-44cf-a949-f8a8d53599c3-config-volume\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.386690 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f762edf1-d60b-44cf-a949-f8a8d53599c3-secret-volume\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.386751 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p58d5\" (UniqueName: \"kubernetes.io/projected/f762edf1-d60b-44cf-a949-f8a8d53599c3-kube-api-access-p58d5\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.386810 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f762edf1-d60b-44cf-a949-f8a8d53599c3-config-volume\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.387745 4869 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f762edf1-d60b-44cf-a949-f8a8d53599c3-config-volume\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.400984 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f762edf1-d60b-44cf-a949-f8a8d53599c3-secret-volume\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.404715 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p58d5\" (UniqueName: \"kubernetes.io/projected/f762edf1-d60b-44cf-a949-f8a8d53599c3-kube-api-access-p58d5\") pod \"collect-profiles-29555475-zv8vj\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.467862 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:00 crc kubenswrapper[4869]: I0312 15:15:00.873718 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj"] Mar 12 15:15:01 crc kubenswrapper[4869]: I0312 15:15:01.765755 4869 generic.go:334] "Generic (PLEG): container finished" podID="f762edf1-d60b-44cf-a949-f8a8d53599c3" containerID="20b506cbc56dc3d139f6dd738d14366b6c8d4771496087708289001b10cbbe7a" exitCode=0 Mar 12 15:15:01 crc kubenswrapper[4869]: I0312 15:15:01.765813 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" event={"ID":"f762edf1-d60b-44cf-a949-f8a8d53599c3","Type":"ContainerDied","Data":"20b506cbc56dc3d139f6dd738d14366b6c8d4771496087708289001b10cbbe7a"} Mar 12 15:15:01 crc kubenswrapper[4869]: I0312 15:15:01.765890 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" event={"ID":"f762edf1-d60b-44cf-a949-f8a8d53599c3","Type":"ContainerStarted","Data":"9d2197465bfd2707c9941321a30f4245a73fcd1820d2c49823f132359f9a3ae1"} Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.091868 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.240917 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f762edf1-d60b-44cf-a949-f8a8d53599c3-config-volume\") pod \"f762edf1-d60b-44cf-a949-f8a8d53599c3\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.241254 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f762edf1-d60b-44cf-a949-f8a8d53599c3-secret-volume\") pod \"f762edf1-d60b-44cf-a949-f8a8d53599c3\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.241287 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p58d5\" (UniqueName: \"kubernetes.io/projected/f762edf1-d60b-44cf-a949-f8a8d53599c3-kube-api-access-p58d5\") pod \"f762edf1-d60b-44cf-a949-f8a8d53599c3\" (UID: \"f762edf1-d60b-44cf-a949-f8a8d53599c3\") " Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.241825 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f762edf1-d60b-44cf-a949-f8a8d53599c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "f762edf1-d60b-44cf-a949-f8a8d53599c3" (UID: "f762edf1-d60b-44cf-a949-f8a8d53599c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.246795 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f762edf1-d60b-44cf-a949-f8a8d53599c3-kube-api-access-p58d5" (OuterVolumeSpecName: "kube-api-access-p58d5") pod "f762edf1-d60b-44cf-a949-f8a8d53599c3" (UID: "f762edf1-d60b-44cf-a949-f8a8d53599c3"). 
InnerVolumeSpecName "kube-api-access-p58d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.248030 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f762edf1-d60b-44cf-a949-f8a8d53599c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f762edf1-d60b-44cf-a949-f8a8d53599c3" (UID: "f762edf1-d60b-44cf-a949-f8a8d53599c3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.343871 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f762edf1-d60b-44cf-a949-f8a8d53599c3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.343906 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f762edf1-d60b-44cf-a949-f8a8d53599c3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.343915 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p58d5\" (UniqueName: \"kubernetes.io/projected/f762edf1-d60b-44cf-a949-f8a8d53599c3-kube-api-access-p58d5\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.783492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" event={"ID":"f762edf1-d60b-44cf-a949-f8a8d53599c3","Type":"ContainerDied","Data":"9d2197465bfd2707c9941321a30f4245a73fcd1820d2c49823f132359f9a3ae1"} Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.783834 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d2197465bfd2707c9941321a30f4245a73fcd1820d2c49823f132359f9a3ae1" Mar 12 15:15:03 crc kubenswrapper[4869]: I0312 15:15:03.783558 4869 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj" Mar 12 15:15:19 crc kubenswrapper[4869]: I0312 15:15:19.683830 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:15:19 crc kubenswrapper[4869]: I0312 15:15:19.684347 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:15:23 crc kubenswrapper[4869]: I0312 15:15:23.951261 4869 generic.go:334] "Generic (PLEG): container finished" podID="300f0918-acb3-42ed-a67a-560608d31eda" containerID="0c6b388172e5ba939e3f8b3ecd2a84648bffb1d3009f4b56cd39dcf9d71f4f84" exitCode=0 Mar 12 15:15:23 crc kubenswrapper[4869]: I0312 15:15:23.951356 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" event={"ID":"300f0918-acb3-42ed-a67a-560608d31eda","Type":"ContainerDied","Data":"0c6b388172e5ba939e3f8b3ecd2a84648bffb1d3009f4b56cd39dcf9d71f4f84"} Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.395893 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.425747 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory\") pod \"300f0918-acb3-42ed-a67a-560608d31eda\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.425873 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jflll\" (UniqueName: \"kubernetes.io/projected/300f0918-acb3-42ed-a67a-560608d31eda-kube-api-access-jflll\") pod \"300f0918-acb3-42ed-a67a-560608d31eda\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.425916 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-ssh-key-openstack-edpm-ipam\") pod \"300f0918-acb3-42ed-a67a-560608d31eda\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.425978 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-bootstrap-combined-ca-bundle\") pod \"300f0918-acb3-42ed-a67a-560608d31eda\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.440643 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "300f0918-acb3-42ed-a67a-560608d31eda" (UID: "300f0918-acb3-42ed-a67a-560608d31eda"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.440772 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300f0918-acb3-42ed-a67a-560608d31eda-kube-api-access-jflll" (OuterVolumeSpecName: "kube-api-access-jflll") pod "300f0918-acb3-42ed-a67a-560608d31eda" (UID: "300f0918-acb3-42ed-a67a-560608d31eda"). InnerVolumeSpecName "kube-api-access-jflll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:15:25 crc kubenswrapper[4869]: E0312 15:15:25.456624 4869 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory podName:300f0918-acb3-42ed-a67a-560608d31eda nodeName:}" failed. No retries permitted until 2026-03-12 15:15:25.956590738 +0000 UTC m=+1678.241816016 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory") pod "300f0918-acb3-42ed-a67a-560608d31eda" (UID: "300f0918-acb3-42ed-a67a-560608d31eda") : error deleting /var/lib/kubelet/pods/300f0918-acb3-42ed-a67a-560608d31eda/volume-subpaths: remove /var/lib/kubelet/pods/300f0918-acb3-42ed-a67a-560608d31eda/volume-subpaths: no such file or directory Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.459162 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "300f0918-acb3-42ed-a67a-560608d31eda" (UID: "300f0918-acb3-42ed-a67a-560608d31eda"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.528789 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jflll\" (UniqueName: \"kubernetes.io/projected/300f0918-acb3-42ed-a67a-560608d31eda-kube-api-access-jflll\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.529651 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.529711 4869 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.972851 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" event={"ID":"300f0918-acb3-42ed-a67a-560608d31eda","Type":"ContainerDied","Data":"726b79b3b5011a74d7744011fd11634a1c2aff7cc3df1e1b0048657fa70293da"} Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.972923 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726b79b3b5011a74d7744011fd11634a1c2aff7cc3df1e1b0048657fa70293da" Mar 12 15:15:25 crc kubenswrapper[4869]: I0312 15:15:25.973031 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.040276 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory\") pod \"300f0918-acb3-42ed-a67a-560608d31eda\" (UID: \"300f0918-acb3-42ed-a67a-560608d31eda\") " Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.045525 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory" (OuterVolumeSpecName: "inventory") pod "300f0918-acb3-42ed-a67a-560608d31eda" (UID: "300f0918-acb3-42ed-a67a-560608d31eda"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.060291 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj"] Mar 12 15:15:26 crc kubenswrapper[4869]: E0312 15:15:26.060801 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f762edf1-d60b-44cf-a949-f8a8d53599c3" containerName="collect-profiles" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.060823 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f762edf1-d60b-44cf-a949-f8a8d53599c3" containerName="collect-profiles" Mar 12 15:15:26 crc kubenswrapper[4869]: E0312 15:15:26.060838 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300f0918-acb3-42ed-a67a-560608d31eda" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.060846 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="300f0918-acb3-42ed-a67a-560608d31eda" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.061145 4869 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="300f0918-acb3-42ed-a67a-560608d31eda" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.061165 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f762edf1-d60b-44cf-a949-f8a8d53599c3" containerName="collect-profiles" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.061893 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.076154 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj"] Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.142649 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.142708 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsnnh\" (UniqueName: \"kubernetes.io/projected/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-kube-api-access-dsnnh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.142842 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.143188 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/300f0918-acb3-42ed-a67a-560608d31eda-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.244659 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.244703 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsnnh\" (UniqueName: \"kubernetes.io/projected/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-kube-api-access-dsnnh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.244747 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.248614 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.248901 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.261669 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsnnh\" (UniqueName: \"kubernetes.io/projected/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-kube-api-access-dsnnh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-57nnj\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.422173 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.962012 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj"] Mar 12 15:15:26 crc kubenswrapper[4869]: I0312 15:15:26.984482 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" event={"ID":"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f","Type":"ContainerStarted","Data":"9cf1d8461ba70f7fccb880587980fad9c0879f4580ba08b76f327efa92a58de9"} Mar 12 15:15:28 crc kubenswrapper[4869]: I0312 15:15:28.000336 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" event={"ID":"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f","Type":"ContainerStarted","Data":"74b593d3deb65194f982fe464ae4c91a05c9146a8201ba054f30f5b58beff680"} Mar 12 15:15:28 crc kubenswrapper[4869]: I0312 15:15:28.021759 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" podStartSLOduration=1.316823015 podStartE2EDuration="2.02173676s" podCreationTimestamp="2026-03-12 15:15:26 +0000 UTC" firstStartedPulling="2026-03-12 15:15:26.96956928 +0000 UTC m=+1679.254794558" lastFinishedPulling="2026-03-12 15:15:27.674483025 +0000 UTC m=+1679.959708303" observedRunningTime="2026-03-12 15:15:28.01682373 +0000 UTC m=+1680.302049018" watchObservedRunningTime="2026-03-12 15:15:28.02173676 +0000 UTC m=+1680.306962048" Mar 12 15:15:31 crc kubenswrapper[4869]: I0312 15:15:31.947342 4869 scope.go:117] "RemoveContainer" containerID="4c8079d6956ec3b6846f4db43100a55ba972a96b2ad8077d35664cb37bb0fdfb" Mar 12 15:15:31 crc kubenswrapper[4869]: I0312 15:15:31.972864 4869 scope.go:117] "RemoveContainer" containerID="7f81bd9fade21a35c31dd509835b1c12df5880f2683e340a53bc73ea3406d1ca" Mar 12 15:15:31 
crc kubenswrapper[4869]: I0312 15:15:31.993721 4869 scope.go:117] "RemoveContainer" containerID="8024b6d2f80b1844ef74c5d435773a5be37299846584c993574bc63e515f0c67" Mar 12 15:15:32 crc kubenswrapper[4869]: I0312 15:15:32.016314 4869 scope.go:117] "RemoveContainer" containerID="938b163b843f4e17857b4a072034c1018ac7a8eb550a4473aec0f820d93c1ca8" Mar 12 15:15:32 crc kubenswrapper[4869]: I0312 15:15:32.048474 4869 scope.go:117] "RemoveContainer" containerID="e14e6076d0352ac1b10d52a6bfe7de727ec71a4c462e25668d80f34c59523246" Mar 12 15:15:49 crc kubenswrapper[4869]: I0312 15:15:49.684286 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:15:49 crc kubenswrapper[4869]: I0312 15:15:49.685021 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:15:49 crc kubenswrapper[4869]: I0312 15:15:49.685077 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:15:49 crc kubenswrapper[4869]: I0312 15:15:49.685887 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:15:49 crc kubenswrapper[4869]: I0312 15:15:49.685933 4869 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" gracePeriod=600 Mar 12 15:15:49 crc kubenswrapper[4869]: E0312 15:15:49.808964 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:15:50 crc kubenswrapper[4869]: I0312 15:15:50.198131 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" exitCode=0 Mar 12 15:15:50 crc kubenswrapper[4869]: I0312 15:15:50.198193 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb"} Mar 12 15:15:50 crc kubenswrapper[4869]: I0312 15:15:50.198806 4869 scope.go:117] "RemoveContainer" containerID="e494c12edd1eec941b7037231b079cb054af121efa06d3e86355c27776905fd6" Mar 12 15:15:50 crc kubenswrapper[4869]: I0312 15:15:50.199406 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:15:50 crc kubenswrapper[4869]: E0312 15:15:50.199686 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.143732 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555476-rt59v"] Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.145983 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-rt59v" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.148672 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.148749 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.148800 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.154157 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-rt59v"] Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.246986 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4fq\" (UniqueName: \"kubernetes.io/projected/e96d5205-aee2-4126-8c64-9de951feffc7-kube-api-access-vp4fq\") pod \"auto-csr-approver-29555476-rt59v\" (UID: \"e96d5205-aee2-4126-8c64-9de951feffc7\") " pod="openshift-infra/auto-csr-approver-29555476-rt59v" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.348939 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4fq\" (UniqueName: 
\"kubernetes.io/projected/e96d5205-aee2-4126-8c64-9de951feffc7-kube-api-access-vp4fq\") pod \"auto-csr-approver-29555476-rt59v\" (UID: \"e96d5205-aee2-4126-8c64-9de951feffc7\") " pod="openshift-infra/auto-csr-approver-29555476-rt59v" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.366646 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4fq\" (UniqueName: \"kubernetes.io/projected/e96d5205-aee2-4126-8c64-9de951feffc7-kube-api-access-vp4fq\") pod \"auto-csr-approver-29555476-rt59v\" (UID: \"e96d5205-aee2-4126-8c64-9de951feffc7\") " pod="openshift-infra/auto-csr-approver-29555476-rt59v" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.467561 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-rt59v" Mar 12 15:16:00 crc kubenswrapper[4869]: I0312 15:16:00.922206 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-rt59v"] Mar 12 15:16:01 crc kubenswrapper[4869]: I0312 15:16:01.327099 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555476-rt59v" event={"ID":"e96d5205-aee2-4126-8c64-9de951feffc7","Type":"ContainerStarted","Data":"9304d4f9dc91549a91b3185a6a5f9bb9c33c5b0c8c638f3234b61056f5428afa"} Mar 12 15:16:02 crc kubenswrapper[4869]: I0312 15:16:02.360463 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555476-rt59v" event={"ID":"e96d5205-aee2-4126-8c64-9de951feffc7","Type":"ContainerStarted","Data":"d6c951f1a88c0beec45caad509423a3b6e84f11c515c38560b98ea0c2a140ae6"} Mar 12 15:16:02 crc kubenswrapper[4869]: I0312 15:16:02.379038 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555476-rt59v" podStartSLOduration=1.300181284 podStartE2EDuration="2.379020469s" podCreationTimestamp="2026-03-12 15:16:00 +0000 UTC" 
firstStartedPulling="2026-03-12 15:16:00.926055798 +0000 UTC m=+1713.211281076" lastFinishedPulling="2026-03-12 15:16:02.004894983 +0000 UTC m=+1714.290120261" observedRunningTime="2026-03-12 15:16:02.374364697 +0000 UTC m=+1714.659589975" watchObservedRunningTime="2026-03-12 15:16:02.379020469 +0000 UTC m=+1714.664245747" Mar 12 15:16:03 crc kubenswrapper[4869]: I0312 15:16:03.336675 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:16:03 crc kubenswrapper[4869]: E0312 15:16:03.337060 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:16:03 crc kubenswrapper[4869]: I0312 15:16:03.371532 4869 generic.go:334] "Generic (PLEG): container finished" podID="e96d5205-aee2-4126-8c64-9de951feffc7" containerID="d6c951f1a88c0beec45caad509423a3b6e84f11c515c38560b98ea0c2a140ae6" exitCode=0 Mar 12 15:16:03 crc kubenswrapper[4869]: I0312 15:16:03.371610 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555476-rt59v" event={"ID":"e96d5205-aee2-4126-8c64-9de951feffc7","Type":"ContainerDied","Data":"d6c951f1a88c0beec45caad509423a3b6e84f11c515c38560b98ea0c2a140ae6"} Mar 12 15:16:04 crc kubenswrapper[4869]: I0312 15:16:04.663572 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-rt59v" Mar 12 15:16:04 crc kubenswrapper[4869]: I0312 15:16:04.861160 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp4fq\" (UniqueName: \"kubernetes.io/projected/e96d5205-aee2-4126-8c64-9de951feffc7-kube-api-access-vp4fq\") pod \"e96d5205-aee2-4126-8c64-9de951feffc7\" (UID: \"e96d5205-aee2-4126-8c64-9de951feffc7\") " Mar 12 15:16:04 crc kubenswrapper[4869]: I0312 15:16:04.866541 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96d5205-aee2-4126-8c64-9de951feffc7-kube-api-access-vp4fq" (OuterVolumeSpecName: "kube-api-access-vp4fq") pod "e96d5205-aee2-4126-8c64-9de951feffc7" (UID: "e96d5205-aee2-4126-8c64-9de951feffc7"). InnerVolumeSpecName "kube-api-access-vp4fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:16:04 crc kubenswrapper[4869]: I0312 15:16:04.964472 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp4fq\" (UniqueName: \"kubernetes.io/projected/e96d5205-aee2-4126-8c64-9de951feffc7-kube-api-access-vp4fq\") on node \"crc\" DevicePath \"\"" Mar 12 15:16:05 crc kubenswrapper[4869]: I0312 15:16:05.410721 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555476-rt59v" event={"ID":"e96d5205-aee2-4126-8c64-9de951feffc7","Type":"ContainerDied","Data":"9304d4f9dc91549a91b3185a6a5f9bb9c33c5b0c8c638f3234b61056f5428afa"} Mar 12 15:16:05 crc kubenswrapper[4869]: I0312 15:16:05.410765 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9304d4f9dc91549a91b3185a6a5f9bb9c33c5b0c8c638f3234b61056f5428afa" Mar 12 15:16:05 crc kubenswrapper[4869]: I0312 15:16:05.410825 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-rt59v" Mar 12 15:16:05 crc kubenswrapper[4869]: I0312 15:16:05.444519 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-wzk9x"] Mar 12 15:16:05 crc kubenswrapper[4869]: I0312 15:16:05.453234 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-wzk9x"] Mar 12 15:16:06 crc kubenswrapper[4869]: I0312 15:16:06.347425 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84874bd4-e292-42eb-bb62-edbd0500482f" path="/var/lib/kubelet/pods/84874bd4-e292-42eb-bb62-edbd0500482f/volumes" Mar 12 15:16:18 crc kubenswrapper[4869]: I0312 15:16:18.343023 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:16:18 crc kubenswrapper[4869]: E0312 15:16:18.343982 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:16:32 crc kubenswrapper[4869]: I0312 15:16:32.188266 4869 scope.go:117] "RemoveContainer" containerID="07488d27a0bf22f7f29266dbf115e85fb87fb51aa90e5154a49f24c7dc895676" Mar 12 15:16:32 crc kubenswrapper[4869]: I0312 15:16:32.337681 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:16:32 crc kubenswrapper[4869]: E0312 15:16:32.337963 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:16:41 crc kubenswrapper[4869]: I0312 15:16:41.065333 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-hsxhc"] Mar 12 15:16:41 crc kubenswrapper[4869]: I0312 15:16:41.082040 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6c3f-account-create-update-pm6md"] Mar 12 15:16:41 crc kubenswrapper[4869]: I0312 15:16:41.090678 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-z6tjp"] Mar 12 15:16:41 crc kubenswrapper[4869]: I0312 15:16:41.106363 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-hsxhc"] Mar 12 15:16:41 crc kubenswrapper[4869]: I0312 15:16:41.121238 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-z6tjp"] Mar 12 15:16:41 crc kubenswrapper[4869]: I0312 15:16:41.138610 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6c3f-account-create-update-pm6md"] Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.045901 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-379e-account-create-update-4fhtb"] Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.056475 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-379e-account-create-update-4fhtb"] Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.066173 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0104-account-create-update-c8czb"] Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.074663 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8j2j6"] Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.082520 4869 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/keystone-0104-account-create-update-c8czb"] Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.090337 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8j2j6"] Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.348777 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031a0a0a-23c0-4e63-959a-a18b826d985c" path="/var/lib/kubelet/pods/031a0a0a-23c0-4e63-959a-a18b826d985c/volumes" Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.349428 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="153f24a9-9165-4c37-87da-8b316a1a64e2" path="/var/lib/kubelet/pods/153f24a9-9165-4c37-87da-8b316a1a64e2/volumes" Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.350387 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47703057-3858-435c-876e-79d208f7e023" path="/var/lib/kubelet/pods/47703057-3858-435c-876e-79d208f7e023/volumes" Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.351216 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90fff1a6-7608-44d4-907b-25b03dc787f1" path="/var/lib/kubelet/pods/90fff1a6-7608-44d4-907b-25b03dc787f1/volumes" Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.353343 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8a7aa9-c42d-4074-a665-a6b2bdc572ae" path="/var/lib/kubelet/pods/ac8a7aa9-c42d-4074-a665-a6b2bdc572ae/volumes" Mar 12 15:16:42 crc kubenswrapper[4869]: I0312 15:16:42.354074 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca9b94a-b7c4-46ef-afe9-a61bdf789138" path="/var/lib/kubelet/pods/aca9b94a-b7c4-46ef-afe9-a61bdf789138/volumes" Mar 12 15:16:45 crc kubenswrapper[4869]: I0312 15:16:45.336426 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:16:45 crc kubenswrapper[4869]: E0312 15:16:45.337183 4869 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:16:47 crc kubenswrapper[4869]: I0312 15:16:47.793853 4869 generic.go:334] "Generic (PLEG): container finished" podID="26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f" containerID="74b593d3deb65194f982fe464ae4c91a05c9146a8201ba054f30f5b58beff680" exitCode=0 Mar 12 15:16:47 crc kubenswrapper[4869]: I0312 15:16:47.793936 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" event={"ID":"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f","Type":"ContainerDied","Data":"74b593d3deb65194f982fe464ae4c91a05c9146a8201ba054f30f5b58beff680"} Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.202896 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.358296 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsnnh\" (UniqueName: \"kubernetes.io/projected/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-kube-api-access-dsnnh\") pod \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.358352 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-ssh-key-openstack-edpm-ipam\") pod \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.358564 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-inventory\") pod \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\" (UID: \"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f\") " Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.364236 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-kube-api-access-dsnnh" (OuterVolumeSpecName: "kube-api-access-dsnnh") pod "26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f" (UID: "26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f"). InnerVolumeSpecName "kube-api-access-dsnnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.388438 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f" (UID: "26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.396783 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-inventory" (OuterVolumeSpecName: "inventory") pod "26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f" (UID: "26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.460370 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.460410 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsnnh\" (UniqueName: \"kubernetes.io/projected/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-kube-api-access-dsnnh\") on node \"crc\" DevicePath \"\"" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.460425 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.812772 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" 
event={"ID":"26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f","Type":"ContainerDied","Data":"9cf1d8461ba70f7fccb880587980fad9c0879f4580ba08b76f327efa92a58de9"} Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.812813 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-57nnj" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.812833 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cf1d8461ba70f7fccb880587980fad9c0879f4580ba08b76f327efa92a58de9" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.914930 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6"] Mar 12 15:16:49 crc kubenswrapper[4869]: E0312 15:16:49.915442 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.915467 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 12 15:16:49 crc kubenswrapper[4869]: E0312 15:16:49.915491 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96d5205-aee2-4126-8c64-9de951feffc7" containerName="oc" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.915499 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96d5205-aee2-4126-8c64-9de951feffc7" containerName="oc" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.915766 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.915790 4869 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e96d5205-aee2-4126-8c64-9de951feffc7" containerName="oc" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.916604 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.927992 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6"] Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.928266 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.928520 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.928729 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:16:49 crc kubenswrapper[4869]: I0312 15:16:49.929536 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.073455 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75cb\" (UniqueName: \"kubernetes.io/projected/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-kube-api-access-q75cb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.073506 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.073660 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.175094 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q75cb\" (UniqueName: \"kubernetes.io/projected/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-kube-api-access-q75cb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.175153 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.175310 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.180917 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.197667 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q75cb\" (UniqueName: \"kubernetes.io/projected/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-kube-api-access-q75cb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.199285 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.240686 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.766997 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6"] Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.781234 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:16:50 crc kubenswrapper[4869]: I0312 15:16:50.823642 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" event={"ID":"7d0293d4-0a27-4535-8a7e-53a6b1c7a835","Type":"ContainerStarted","Data":"14563f2d3189a7757735c8b538a9d0ddc2381cf0436dbe83beae7ff6f21ce280"} Mar 12 15:16:51 crc kubenswrapper[4869]: I0312 15:16:51.836194 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" event={"ID":"7d0293d4-0a27-4535-8a7e-53a6b1c7a835","Type":"ContainerStarted","Data":"d223c5f4356dc14b152aed6f9f399f7a936e5eea00f1336faed04e6c656d6778"} Mar 12 15:16:51 crc kubenswrapper[4869]: I0312 15:16:51.859632 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" podStartSLOduration=2.428414094 podStartE2EDuration="2.859612319s" podCreationTimestamp="2026-03-12 15:16:49 +0000 UTC" firstStartedPulling="2026-03-12 15:16:50.780894643 +0000 UTC m=+1763.066119921" lastFinishedPulling="2026-03-12 15:16:51.212092858 +0000 UTC m=+1763.497318146" observedRunningTime="2026-03-12 15:16:51.849625622 +0000 UTC m=+1764.134850910" watchObservedRunningTime="2026-03-12 15:16:51.859612319 +0000 UTC m=+1764.144837597" Mar 12 15:16:59 crc kubenswrapper[4869]: I0312 15:16:59.336144 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 
15:16:59 crc kubenswrapper[4869]: E0312 15:16:59.336846 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:17:01 crc kubenswrapper[4869]: I0312 15:17:01.042662 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z6955"] Mar 12 15:17:01 crc kubenswrapper[4869]: I0312 15:17:01.076712 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z6955"] Mar 12 15:17:02 crc kubenswrapper[4869]: I0312 15:17:02.346828 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d7822f-0666-4c82-9dc3-8f2074cdc5e0" path="/var/lib/kubelet/pods/f8d7822f-0666-4c82-9dc3-8f2074cdc5e0/volumes" Mar 12 15:17:05 crc kubenswrapper[4869]: I0312 15:17:05.026853 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zn9wl"] Mar 12 15:17:05 crc kubenswrapper[4869]: I0312 15:17:05.036387 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zn9wl"] Mar 12 15:17:06 crc kubenswrapper[4869]: I0312 15:17:06.350259 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f34b7c-48b6-4191-b62a-8d8c373b3197" path="/var/lib/kubelet/pods/d3f34b7c-48b6-4191-b62a-8d8c373b3197/volumes" Mar 12 15:17:13 crc kubenswrapper[4869]: I0312 15:17:13.336883 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:17:13 crc kubenswrapper[4869]: E0312 15:17:13.337664 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.058597 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-aa40-account-create-update-dj2sh"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.083249 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-n44jd"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.096342 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-318e-account-create-update-zvxwz"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.112479 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-n44jd"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.123905 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-aa40-account-create-update-dj2sh"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.132353 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-318e-account-create-update-zvxwz"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.140259 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-468h4"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.148865 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-efc2-account-create-update-54dhl"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.158022 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-468h4"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.167478 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-efc2-account-create-update-54dhl"] 
Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.176379 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-6589-account-create-update-flcbb"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.185433 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-6589-account-create-update-flcbb"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.194615 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gfwwb"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.202921 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gfwwb"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.211168 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4nrzc"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.218906 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4nrzc"] Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.348967 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bcf123a-c619-4d5e-a18a-e9a40db85ce4" path="/var/lib/kubelet/pods/0bcf123a-c619-4d5e-a18a-e9a40db85ce4/volumes" Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.349914 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c82400-c2cb-44d6-a384-8c604824f269" path="/var/lib/kubelet/pods/18c82400-c2cb-44d6-a384-8c604824f269/volumes" Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.350735 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce" path="/var/lib/kubelet/pods/2552e441-e5f2-4fe6-bd1b-4a92dc9e2cce/volumes" Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.351425 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2917626f-4943-4e2a-989b-3ec041337596" path="/var/lib/kubelet/pods/2917626f-4943-4e2a-989b-3ec041337596/volumes" Mar 12 
15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.352686 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b8504b7-ac15-45df-b26a-7c848cf4c068" path="/var/lib/kubelet/pods/7b8504b7-ac15-45df-b26a-7c848cf4c068/volumes" Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.353269 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab13ae17-fd00-4317-bbe7-bac949fbb0bb" path="/var/lib/kubelet/pods/ab13ae17-fd00-4317-bbe7-bac949fbb0bb/volumes" Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.353943 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31b3289-8593-4def-be89-bed7b813ef1b" path="/var/lib/kubelet/pods/e31b3289-8593-4def-be89-bed7b813ef1b/volumes" Mar 12 15:17:22 crc kubenswrapper[4869]: I0312 15:17:22.355092 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0" path="/var/lib/kubelet/pods/eec2ff73-c9d9-4bd5-9d47-f5e2bcbfdfb0/volumes" Mar 12 15:17:26 crc kubenswrapper[4869]: I0312 15:17:26.032155 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-h89v6"] Mar 12 15:17:26 crc kubenswrapper[4869]: I0312 15:17:26.040320 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-h89v6"] Mar 12 15:17:26 crc kubenswrapper[4869]: I0312 15:17:26.348893 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42953bf3-0bd5-467e-9502-6ddea5e8bf92" path="/var/lib/kubelet/pods/42953bf3-0bd5-467e-9502-6ddea5e8bf92/volumes" Mar 12 15:17:27 crc kubenswrapper[4869]: I0312 15:17:27.336575 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:17:27 crc kubenswrapper[4869]: E0312 15:17:27.337324 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.259014 4869 scope.go:117] "RemoveContainer" containerID="280546406e7403bb6ec74cebcd27dc92a837e19c2dc31048c2e42d77827e7cc0" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.298205 4869 scope.go:117] "RemoveContainer" containerID="d7e2802fbcddddc202060a2ad07bea3a697bcfc6c526ac74580df40d5d118e85" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.362752 4869 scope.go:117] "RemoveContainer" containerID="af10a209726e9809049a6e11300a42688ca9981619c3653fcc5a079eff069a77" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.407966 4869 scope.go:117] "RemoveContainer" containerID="a481a856bf43915cc69cc9fb7c87bdace3c38e60a67f180da8a40a46b4c0e65e" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.473896 4869 scope.go:117] "RemoveContainer" containerID="0f11c17d6fda01797911d63a6fe8618956c8b677c30f987289b01ac858fca9f2" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.498408 4869 scope.go:117] "RemoveContainer" containerID="3a2842f8f11d563df3cf17fa015a2d9eaad47c9fd20005ea4dada238ecf6a1b7" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.542588 4869 scope.go:117] "RemoveContainer" containerID="dfc729fa701f62a7acbd858e1ce6589b7b5cc7f3e4e0814d46c8b66a576c30bb" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.562674 4869 scope.go:117] "RemoveContainer" containerID="af39558788478d12f7c7ceee80794db28f43b752c6ee72bd04ea21cb05f4b98f" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.581809 4869 scope.go:117] "RemoveContainer" containerID="6af08a6ff9b419383ee5c8d8382ee8e3ee2134a04856f377c4930ee11e803f97" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.604420 4869 scope.go:117] "RemoveContainer" 
containerID="d6f56cce82a1c2c58f55b9d26a6c8afe3d7b1528ce0cde80535c453bab65d721" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.627954 4869 scope.go:117] "RemoveContainer" containerID="57f46838f8acc49c64c26dd3e754df9d2ce140ccd168d2986aa03a401f13ba11" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.653231 4869 scope.go:117] "RemoveContainer" containerID="85aac3668302de3448bcef5f1aed7619f7feddce3efa61660a003a9bd189257b" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.687242 4869 scope.go:117] "RemoveContainer" containerID="eb0b5225f0b17ea9fd14d44ab11c0e8d9b454f53a8ec345e0a8c5bdf4f90dca2" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.706877 4869 scope.go:117] "RemoveContainer" containerID="85735fec3a2ab9087b7382ec8eb98e9ade6eecc5dc90c0f8a1262267adcfe7dd" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.728827 4869 scope.go:117] "RemoveContainer" containerID="f9a0313a123637debb355a3e3a1a19d19df8997691674ff56da3a3483f37dbeb" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.764415 4869 scope.go:117] "RemoveContainer" containerID="d8b7f085be38379d8095677d18d685b0df4208cd2575cd83e383228f23a6875c" Mar 12 15:17:32 crc kubenswrapper[4869]: I0312 15:17:32.797066 4869 scope.go:117] "RemoveContainer" containerID="e7d3c51e8ec76a1ad414f7a73293decf839862c1d364222ca0eee41115ec500b" Mar 12 15:17:41 crc kubenswrapper[4869]: I0312 15:17:41.336460 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:17:41 crc kubenswrapper[4869]: E0312 15:17:41.337122 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:17:53 crc kubenswrapper[4869]: I0312 15:17:53.050822 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-556xl"] Mar 12 15:17:53 crc kubenswrapper[4869]: I0312 15:17:53.058914 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-556xl"] Mar 12 15:17:53 crc kubenswrapper[4869]: I0312 15:17:53.338291 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:17:53 crc kubenswrapper[4869]: E0312 15:17:53.338951 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:17:54 crc kubenswrapper[4869]: I0312 15:17:54.365619 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50098315-1895-433b-9d66-df198c579b4e" path="/var/lib/kubelet/pods/50098315-1895-433b-9d66-df198c579b4e/volumes" Mar 12 15:17:56 crc kubenswrapper[4869]: I0312 15:17:56.036618 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-vxdx8"] Mar 12 15:17:56 crc kubenswrapper[4869]: I0312 15:17:56.046018 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-vxdx8"] Mar 12 15:17:56 crc kubenswrapper[4869]: I0312 15:17:56.346874 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cadca9f-20ef-432c-8816-e5fea0d0c93e" path="/var/lib/kubelet/pods/2cadca9f-20ef-432c-8816-e5fea0d0c93e/volumes" Mar 12 15:17:56 crc kubenswrapper[4869]: I0312 15:17:56.393704 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="7d0293d4-0a27-4535-8a7e-53a6b1c7a835" containerID="d223c5f4356dc14b152aed6f9f399f7a936e5eea00f1336faed04e6c656d6778" exitCode=0 Mar 12 15:17:56 crc kubenswrapper[4869]: I0312 15:17:56.393747 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" event={"ID":"7d0293d4-0a27-4535-8a7e-53a6b1c7a835","Type":"ContainerDied","Data":"d223c5f4356dc14b152aed6f9f399f7a936e5eea00f1336faed04e6c656d6778"} Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.810056 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.876993 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q75cb\" (UniqueName: \"kubernetes.io/projected/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-kube-api-access-q75cb\") pod \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.877099 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-ssh-key-openstack-edpm-ipam\") pod \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.877247 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-inventory\") pod \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\" (UID: \"7d0293d4-0a27-4535-8a7e-53a6b1c7a835\") " Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.884082 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-kube-api-access-q75cb" (OuterVolumeSpecName: "kube-api-access-q75cb") pod "7d0293d4-0a27-4535-8a7e-53a6b1c7a835" (UID: "7d0293d4-0a27-4535-8a7e-53a6b1c7a835"). InnerVolumeSpecName "kube-api-access-q75cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.911170 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-inventory" (OuterVolumeSpecName: "inventory") pod "7d0293d4-0a27-4535-8a7e-53a6b1c7a835" (UID: "7d0293d4-0a27-4535-8a7e-53a6b1c7a835"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.911488 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7d0293d4-0a27-4535-8a7e-53a6b1c7a835" (UID: "7d0293d4-0a27-4535-8a7e-53a6b1c7a835"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.979934 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q75cb\" (UniqueName: \"kubernetes.io/projected/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-kube-api-access-q75cb\") on node \"crc\" DevicePath \"\"" Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.979972 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:17:57 crc kubenswrapper[4869]: I0312 15:17:57.979987 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d0293d4-0a27-4535-8a7e-53a6b1c7a835-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.419150 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" event={"ID":"7d0293d4-0a27-4535-8a7e-53a6b1c7a835","Type":"ContainerDied","Data":"14563f2d3189a7757735c8b538a9d0ddc2381cf0436dbe83beae7ff6f21ce280"} Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.419492 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14563f2d3189a7757735c8b538a9d0ddc2381cf0436dbe83beae7ff6f21ce280" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.419204 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.526789 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b"] Mar 12 15:17:58 crc kubenswrapper[4869]: E0312 15:17:58.527232 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0293d4-0a27-4535-8a7e-53a6b1c7a835" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.527248 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0293d4-0a27-4535-8a7e-53a6b1c7a835" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.527457 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0293d4-0a27-4535-8a7e-53a6b1c7a835" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.528183 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.530074 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.530680 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.530693 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.531794 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.546579 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b"] Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.591336 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.591471 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj59\" (UniqueName: \"kubernetes.io/projected/69ed9651-a1cb-4470-8629-adeb0dea6377-kube-api-access-zkj59\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 
15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.591523 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.693645 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.693791 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.694417 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkj59\" (UniqueName: \"kubernetes.io/projected/69ed9651-a1cb-4470-8629-adeb0dea6377-kube-api-access-zkj59\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.698964 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.708174 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.709818 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkj59\" (UniqueName: \"kubernetes.io/projected/69ed9651-a1cb-4470-8629-adeb0dea6377-kube-api-access-zkj59\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-2794b\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:58 crc kubenswrapper[4869]: I0312 15:17:58.847696 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:17:59 crc kubenswrapper[4869]: I0312 15:17:59.044668 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jpbgs"] Mar 12 15:17:59 crc kubenswrapper[4869]: I0312 15:17:59.057738 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jpbgs"] Mar 12 15:17:59 crc kubenswrapper[4869]: I0312 15:17:59.327915 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b"] Mar 12 15:17:59 crc kubenswrapper[4869]: I0312 15:17:59.428412 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" event={"ID":"69ed9651-a1cb-4470-8629-adeb0dea6377","Type":"ContainerStarted","Data":"185c4e239ced638ac0076e4a733333416d89b3aee09f11611e0da805a657282e"} Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.143481 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555478-5smpd"] Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.145744 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-5smpd" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.148926 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.149250 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.149494 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.160345 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-5smpd"] Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.223492 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frqq\" (UniqueName: \"kubernetes.io/projected/a11410e3-6d91-4cfe-b61d-fa28af63504a-kube-api-access-6frqq\") pod \"auto-csr-approver-29555478-5smpd\" (UID: \"a11410e3-6d91-4cfe-b61d-fa28af63504a\") " pod="openshift-infra/auto-csr-approver-29555478-5smpd" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.325898 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frqq\" (UniqueName: \"kubernetes.io/projected/a11410e3-6d91-4cfe-b61d-fa28af63504a-kube-api-access-6frqq\") pod \"auto-csr-approver-29555478-5smpd\" (UID: \"a11410e3-6d91-4cfe-b61d-fa28af63504a\") " pod="openshift-infra/auto-csr-approver-29555478-5smpd" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.343106 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frqq\" (UniqueName: \"kubernetes.io/projected/a11410e3-6d91-4cfe-b61d-fa28af63504a-kube-api-access-6frqq\") pod \"auto-csr-approver-29555478-5smpd\" (UID: \"a11410e3-6d91-4cfe-b61d-fa28af63504a\") " 
pod="openshift-infra/auto-csr-approver-29555478-5smpd" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.347758 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011be9f6-aaa2-48f8-8ef0-aa4c21e2d718" path="/var/lib/kubelet/pods/011be9f6-aaa2-48f8-8ef0-aa4c21e2d718/volumes" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.436608 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" event={"ID":"69ed9651-a1cb-4470-8629-adeb0dea6377","Type":"ContainerStarted","Data":"9e2c8d57431c4f275fe8adf50272d6fa04e56ab425bd2b6c09cc6898ad7a70e6"} Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.452983 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" podStartSLOduration=2.019069071 podStartE2EDuration="2.452966474s" podCreationTimestamp="2026-03-12 15:17:58 +0000 UTC" firstStartedPulling="2026-03-12 15:17:59.331439958 +0000 UTC m=+1831.616665236" lastFinishedPulling="2026-03-12 15:17:59.765337371 +0000 UTC m=+1832.050562639" observedRunningTime="2026-03-12 15:18:00.451369033 +0000 UTC m=+1832.736594311" watchObservedRunningTime="2026-03-12 15:18:00.452966474 +0000 UTC m=+1832.738191752" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.493253 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-5smpd" Mar 12 15:18:00 crc kubenswrapper[4869]: I0312 15:18:00.926499 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-5smpd"] Mar 12 15:18:01 crc kubenswrapper[4869]: I0312 15:18:01.449439 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555478-5smpd" event={"ID":"a11410e3-6d91-4cfe-b61d-fa28af63504a","Type":"ContainerStarted","Data":"73d0ba72cefc8d3b6856c41b1a11db4dba96d87ce5e9f876f3780662304ac856"} Mar 12 15:18:02 crc kubenswrapper[4869]: I0312 15:18:02.458666 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555478-5smpd" event={"ID":"a11410e3-6d91-4cfe-b61d-fa28af63504a","Type":"ContainerStarted","Data":"1d076ff963f5c0bce1825f7bbe8cd5b4193df8f5e7c7af916609a77246f4cd77"} Mar 12 15:18:02 crc kubenswrapper[4869]: I0312 15:18:02.476236 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555478-5smpd" podStartSLOduration=1.339118985 podStartE2EDuration="2.476218564s" podCreationTimestamp="2026-03-12 15:18:00 +0000 UTC" firstStartedPulling="2026-03-12 15:18:00.939163584 +0000 UTC m=+1833.224388862" lastFinishedPulling="2026-03-12 15:18:02.076263153 +0000 UTC m=+1834.361488441" observedRunningTime="2026-03-12 15:18:02.472132219 +0000 UTC m=+1834.757357497" watchObservedRunningTime="2026-03-12 15:18:02.476218564 +0000 UTC m=+1834.761443842" Mar 12 15:18:03 crc kubenswrapper[4869]: I0312 15:18:03.467880 4869 generic.go:334] "Generic (PLEG): container finished" podID="a11410e3-6d91-4cfe-b61d-fa28af63504a" containerID="1d076ff963f5c0bce1825f7bbe8cd5b4193df8f5e7c7af916609a77246f4cd77" exitCode=0 Mar 12 15:18:03 crc kubenswrapper[4869]: I0312 15:18:03.468275 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555478-5smpd" 
event={"ID":"a11410e3-6d91-4cfe-b61d-fa28af63504a","Type":"ContainerDied","Data":"1d076ff963f5c0bce1825f7bbe8cd5b4193df8f5e7c7af916609a77246f4cd77"} Mar 12 15:18:04 crc kubenswrapper[4869]: I0312 15:18:04.477336 4869 generic.go:334] "Generic (PLEG): container finished" podID="69ed9651-a1cb-4470-8629-adeb0dea6377" containerID="9e2c8d57431c4f275fe8adf50272d6fa04e56ab425bd2b6c09cc6898ad7a70e6" exitCode=0 Mar 12 15:18:04 crc kubenswrapper[4869]: I0312 15:18:04.477429 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" event={"ID":"69ed9651-a1cb-4470-8629-adeb0dea6377","Type":"ContainerDied","Data":"9e2c8d57431c4f275fe8adf50272d6fa04e56ab425bd2b6c09cc6898ad7a70e6"} Mar 12 15:18:04 crc kubenswrapper[4869]: I0312 15:18:04.798032 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-5smpd" Mar 12 15:18:04 crc kubenswrapper[4869]: I0312 15:18:04.829060 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6frqq\" (UniqueName: \"kubernetes.io/projected/a11410e3-6d91-4cfe-b61d-fa28af63504a-kube-api-access-6frqq\") pod \"a11410e3-6d91-4cfe-b61d-fa28af63504a\" (UID: \"a11410e3-6d91-4cfe-b61d-fa28af63504a\") " Mar 12 15:18:04 crc kubenswrapper[4869]: I0312 15:18:04.834049 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11410e3-6d91-4cfe-b61d-fa28af63504a-kube-api-access-6frqq" (OuterVolumeSpecName: "kube-api-access-6frqq") pod "a11410e3-6d91-4cfe-b61d-fa28af63504a" (UID: "a11410e3-6d91-4cfe-b61d-fa28af63504a"). InnerVolumeSpecName "kube-api-access-6frqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:18:04 crc kubenswrapper[4869]: I0312 15:18:04.932353 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6frqq\" (UniqueName: \"kubernetes.io/projected/a11410e3-6d91-4cfe-b61d-fa28af63504a-kube-api-access-6frqq\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.336949 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:18:05 crc kubenswrapper[4869]: E0312 15:18:05.337219 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.487637 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555478-5smpd" event={"ID":"a11410e3-6d91-4cfe-b61d-fa28af63504a","Type":"ContainerDied","Data":"73d0ba72cefc8d3b6856c41b1a11db4dba96d87ce5e9f876f3780662304ac856"} Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.487721 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d0ba72cefc8d3b6856c41b1a11db4dba96d87ce5e9f876f3780662304ac856" Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.487685 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-5smpd" Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.856297 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.887005 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-6np76"] Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.901765 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-6np76"] Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.951665 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-inventory\") pod \"69ed9651-a1cb-4470-8629-adeb0dea6377\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.951747 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-ssh-key-openstack-edpm-ipam\") pod \"69ed9651-a1cb-4470-8629-adeb0dea6377\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.951793 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkj59\" (UniqueName: \"kubernetes.io/projected/69ed9651-a1cb-4470-8629-adeb0dea6377-kube-api-access-zkj59\") pod \"69ed9651-a1cb-4470-8629-adeb0dea6377\" (UID: \"69ed9651-a1cb-4470-8629-adeb0dea6377\") " Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.957743 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ed9651-a1cb-4470-8629-adeb0dea6377-kube-api-access-zkj59" (OuterVolumeSpecName: "kube-api-access-zkj59") pod "69ed9651-a1cb-4470-8629-adeb0dea6377" (UID: "69ed9651-a1cb-4470-8629-adeb0dea6377"). InnerVolumeSpecName "kube-api-access-zkj59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.981129 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-inventory" (OuterVolumeSpecName: "inventory") pod "69ed9651-a1cb-4470-8629-adeb0dea6377" (UID: "69ed9651-a1cb-4470-8629-adeb0dea6377"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:18:05 crc kubenswrapper[4869]: I0312 15:18:05.981686 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "69ed9651-a1cb-4470-8629-adeb0dea6377" (UID: "69ed9651-a1cb-4470-8629-adeb0dea6377"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.053880 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.053911 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ed9651-a1cb-4470-8629-adeb0dea6377-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.053923 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkj59\" (UniqueName: \"kubernetes.io/projected/69ed9651-a1cb-4470-8629-adeb0dea6377-kube-api-access-zkj59\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.347188 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ea2272-6e47-4761-9a89-c6c2d0768aee" 
path="/var/lib/kubelet/pods/63ea2272-6e47-4761-9a89-c6c2d0768aee/volumes" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.500564 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" event={"ID":"69ed9651-a1cb-4470-8629-adeb0dea6377","Type":"ContainerDied","Data":"185c4e239ced638ac0076e4a733333416d89b3aee09f11611e0da805a657282e"} Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.500937 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185c4e239ced638ac0076e4a733333416d89b3aee09f11611e0da805a657282e" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.500756 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-2794b" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.561673 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg"] Mar 12 15:18:06 crc kubenswrapper[4869]: E0312 15:18:06.562136 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ed9651-a1cb-4470-8629-adeb0dea6377" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.562149 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ed9651-a1cb-4470-8629-adeb0dea6377" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 15:18:06 crc kubenswrapper[4869]: E0312 15:18:06.562163 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11410e3-6d91-4cfe-b61d-fa28af63504a" containerName="oc" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.562169 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11410e3-6d91-4cfe-b61d-fa28af63504a" containerName="oc" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.562362 4869 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="a11410e3-6d91-4cfe-b61d-fa28af63504a" containerName="oc" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.562376 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ed9651-a1cb-4470-8629-adeb0dea6377" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.563051 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.568496 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.568736 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.568885 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.569532 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.574008 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg"] Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.665825 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.665888 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.666350 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjrb\" (UniqueName: \"kubernetes.io/projected/cc29ed56-a386-475e-aece-7d5d20a479dd-kube-api-access-7rjrb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.768141 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjrb\" (UniqueName: \"kubernetes.io/projected/cc29ed56-a386-475e-aece-7d5d20a479dd-kube-api-access-7rjrb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.768454 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.768532 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.773500 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.777042 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.787014 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjrb\" (UniqueName: \"kubernetes.io/projected/cc29ed56-a386-475e-aece-7d5d20a479dd-kube-api-access-7rjrb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xlnsg\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:06 crc kubenswrapper[4869]: I0312 15:18:06.888826 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:07 crc kubenswrapper[4869]: I0312 15:18:07.371352 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg"] Mar 12 15:18:07 crc kubenswrapper[4869]: I0312 15:18:07.508727 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" event={"ID":"cc29ed56-a386-475e-aece-7d5d20a479dd","Type":"ContainerStarted","Data":"c75b3704c6bc9bd32dc07b3d45cac00424f078cbcd2111c4824345029028d93f"} Mar 12 15:18:08 crc kubenswrapper[4869]: I0312 15:18:08.520923 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" event={"ID":"cc29ed56-a386-475e-aece-7d5d20a479dd","Type":"ContainerStarted","Data":"7681b0bba29b0d009196d9d5e7656c188d5fc87bf92a47da4e23af03d018a505"} Mar 12 15:18:08 crc kubenswrapper[4869]: I0312 15:18:08.536180 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" podStartSLOduration=2.126579759 podStartE2EDuration="2.536159298s" podCreationTimestamp="2026-03-12 15:18:06 +0000 UTC" firstStartedPulling="2026-03-12 15:18:07.376580331 +0000 UTC m=+1839.661805609" lastFinishedPulling="2026-03-12 15:18:07.78615987 +0000 UTC m=+1840.071385148" observedRunningTime="2026-03-12 15:18:08.534139576 +0000 UTC m=+1840.819364854" watchObservedRunningTime="2026-03-12 15:18:08.536159298 +0000 UTC m=+1840.821384576" Mar 12 15:18:19 crc kubenswrapper[4869]: I0312 15:18:19.337110 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:18:19 crc kubenswrapper[4869]: E0312 15:18:19.338166 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:18:23 crc kubenswrapper[4869]: I0312 15:18:23.029837 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-4rkhc"] Mar 12 15:18:23 crc kubenswrapper[4869]: I0312 15:18:23.039860 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-4rkhc"] Mar 12 15:18:24 crc kubenswrapper[4869]: I0312 15:18:24.348304 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0e86fd-502f-4fef-9f29-4d612a8d111f" path="/var/lib/kubelet/pods/bb0e86fd-502f-4fef-9f29-4d612a8d111f/volumes" Mar 12 15:18:32 crc kubenswrapper[4869]: I0312 15:18:32.337450 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:18:32 crc kubenswrapper[4869]: E0312 15:18:32.338441 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:18:33 crc kubenswrapper[4869]: I0312 15:18:33.046512 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-9p9gw"] Mar 12 15:18:33 crc kubenswrapper[4869]: I0312 15:18:33.058788 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-9p9gw"] Mar 12 15:18:33 crc kubenswrapper[4869]: I0312 15:18:33.075742 4869 scope.go:117] "RemoveContainer" 
containerID="749a7118013cbc6f8c1739816d664a47f147c25b58b38882f1e0f5390a076d26" Mar 12 15:18:33 crc kubenswrapper[4869]: I0312 15:18:33.120556 4869 scope.go:117] "RemoveContainer" containerID="f50a5fe27635df7f9f7c005365851251f0111e896a624e12eefe7984ae07ddef" Mar 12 15:18:33 crc kubenswrapper[4869]: I0312 15:18:33.163305 4869 scope.go:117] "RemoveContainer" containerID="f6cfca39163cd028fe9ac1d3846a8353e6ddb513810a96eec760ed92c908ebaa" Mar 12 15:18:33 crc kubenswrapper[4869]: I0312 15:18:33.188361 4869 scope.go:117] "RemoveContainer" containerID="089f1642db2255e3eebe62fe11a94058c554f002ae443794d01353199c0fce28" Mar 12 15:18:33 crc kubenswrapper[4869]: I0312 15:18:33.231503 4869 scope.go:117] "RemoveContainer" containerID="99ab82161b41e97ab7c1ebd91e6d6c0cf00ba76e190a938a5a292faf679ad0ab" Mar 12 15:18:34 crc kubenswrapper[4869]: I0312 15:18:34.352496 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8721cab-3eb8-4c80-a0c8-79c7e007b614" path="/var/lib/kubelet/pods/e8721cab-3eb8-4c80-a0c8-79c7e007b614/volumes" Mar 12 15:18:40 crc kubenswrapper[4869]: I0312 15:18:40.811995 4869 generic.go:334] "Generic (PLEG): container finished" podID="cc29ed56-a386-475e-aece-7d5d20a479dd" containerID="7681b0bba29b0d009196d9d5e7656c188d5fc87bf92a47da4e23af03d018a505" exitCode=0 Mar 12 15:18:40 crc kubenswrapper[4869]: I0312 15:18:40.812096 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" event={"ID":"cc29ed56-a386-475e-aece-7d5d20a479dd","Type":"ContainerDied","Data":"7681b0bba29b0d009196d9d5e7656c188d5fc87bf92a47da4e23af03d018a505"} Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.237420 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.283044 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-ssh-key-openstack-edpm-ipam\") pod \"cc29ed56-a386-475e-aece-7d5d20a479dd\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.283253 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-inventory\") pod \"cc29ed56-a386-475e-aece-7d5d20a479dd\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.283364 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rjrb\" (UniqueName: \"kubernetes.io/projected/cc29ed56-a386-475e-aece-7d5d20a479dd-kube-api-access-7rjrb\") pod \"cc29ed56-a386-475e-aece-7d5d20a479dd\" (UID: \"cc29ed56-a386-475e-aece-7d5d20a479dd\") " Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.289449 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc29ed56-a386-475e-aece-7d5d20a479dd-kube-api-access-7rjrb" (OuterVolumeSpecName: "kube-api-access-7rjrb") pod "cc29ed56-a386-475e-aece-7d5d20a479dd" (UID: "cc29ed56-a386-475e-aece-7d5d20a479dd"). InnerVolumeSpecName "kube-api-access-7rjrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.311913 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cc29ed56-a386-475e-aece-7d5d20a479dd" (UID: "cc29ed56-a386-475e-aece-7d5d20a479dd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.313974 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-inventory" (OuterVolumeSpecName: "inventory") pod "cc29ed56-a386-475e-aece-7d5d20a479dd" (UID: "cc29ed56-a386-475e-aece-7d5d20a479dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.386380 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.386411 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc29ed56-a386-475e-aece-7d5d20a479dd-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.386423 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rjrb\" (UniqueName: \"kubernetes.io/projected/cc29ed56-a386-475e-aece-7d5d20a479dd-kube-api-access-7rjrb\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.830604 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" 
event={"ID":"cc29ed56-a386-475e-aece-7d5d20a479dd","Type":"ContainerDied","Data":"c75b3704c6bc9bd32dc07b3d45cac00424f078cbcd2111c4824345029028d93f"} Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.830657 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c75b3704c6bc9bd32dc07b3d45cac00424f078cbcd2111c4824345029028d93f" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.831255 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xlnsg" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.918518 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf"] Mar 12 15:18:42 crc kubenswrapper[4869]: E0312 15:18:42.919086 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc29ed56-a386-475e-aece-7d5d20a479dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.919110 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc29ed56-a386-475e-aece-7d5d20a479dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.919320 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc29ed56-a386-475e-aece-7d5d20a479dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.920041 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.923888 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.924008 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.924164 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.924301 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.935404 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf"] Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.998324 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.998423 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:42 crc kubenswrapper[4869]: I0312 15:18:42.998579 
4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf7m7\" (UniqueName: \"kubernetes.io/projected/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-kube-api-access-kf7m7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.100420 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.100508 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.100629 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf7m7\" (UniqueName: \"kubernetes.io/projected/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-kube-api-access-kf7m7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.105572 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.105688 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.124156 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf7m7\" (UniqueName: \"kubernetes.io/projected/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-kube-api-access-kf7m7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.238956 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:18:43 crc kubenswrapper[4869]: I0312 15:18:43.848128 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf"] Mar 12 15:18:44 crc kubenswrapper[4869]: I0312 15:18:44.859751 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" event={"ID":"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2","Type":"ContainerStarted","Data":"18a135aa67d3c0e072cce584f1ef3e6d5f46726a9582ad22d6ae26896df7efd5"} Mar 12 15:18:44 crc kubenswrapper[4869]: I0312 15:18:44.861364 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" event={"ID":"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2","Type":"ContainerStarted","Data":"3bdb2ff10e5f3c62b7f106a7b4460cfaf746d0fe8201b7163c539b096680a425"} Mar 12 15:18:44 crc kubenswrapper[4869]: I0312 15:18:44.875581 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" podStartSLOduration=2.474147292 podStartE2EDuration="2.875559532s" podCreationTimestamp="2026-03-12 15:18:42 +0000 UTC" firstStartedPulling="2026-03-12 15:18:43.854057366 +0000 UTC m=+1876.139282654" lastFinishedPulling="2026-03-12 15:18:44.255469616 +0000 UTC m=+1876.540694894" observedRunningTime="2026-03-12 15:18:44.87468541 +0000 UTC m=+1877.159910698" watchObservedRunningTime="2026-03-12 15:18:44.875559532 +0000 UTC m=+1877.160784810" Mar 12 15:18:46 crc kubenswrapper[4869]: I0312 15:18:46.337339 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:18:46 crc kubenswrapper[4869]: E0312 15:18:46.337715 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:18:50 crc kubenswrapper[4869]: I0312 15:18:50.053995 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-t7fgp"] Mar 12 15:18:50 crc kubenswrapper[4869]: I0312 15:18:50.067733 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-t7fgp"] Mar 12 15:18:50 crc kubenswrapper[4869]: I0312 15:18:50.351628 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1953bb-fc5d-4285-8de4-b67746201d05" path="/var/lib/kubelet/pods/ca1953bb-fc5d-4285-8de4-b67746201d05/volumes" Mar 12 15:18:59 crc kubenswrapper[4869]: I0312 15:18:59.336530 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:18:59 crc kubenswrapper[4869]: E0312 15:18:59.337439 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:19:03 crc kubenswrapper[4869]: I0312 15:19:03.040027 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1e62-account-create-update-8jrd8"] Mar 12 15:19:03 crc kubenswrapper[4869]: I0312 15:19:03.055099 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1e62-account-create-update-8jrd8"] Mar 12 15:19:04 crc kubenswrapper[4869]: I0312 15:19:04.033686 4869 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell1-e653-account-create-update-mjf5p"] Mar 12 15:19:04 crc kubenswrapper[4869]: I0312 15:19:04.050954 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e653-account-create-update-mjf5p"] Mar 12 15:19:04 crc kubenswrapper[4869]: I0312 15:19:04.348912 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7" path="/var/lib/kubelet/pods/2e0c6ab6-b83f-4d2a-9a5f-30c18a57b0d7/volumes" Mar 12 15:19:04 crc kubenswrapper[4869]: I0312 15:19:04.349454 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8081e56b-635e-4137-982b-c5eafd77af8e" path="/var/lib/kubelet/pods/8081e56b-635e-4137-982b-c5eafd77af8e/volumes" Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.041985 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-pzfq5"] Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.052162 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-43b5-account-create-update-5ggll"] Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.066179 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mb78b"] Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.075229 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-96j6d"] Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.082703 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-pzfq5"] Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.090462 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mb78b"] Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.097703 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-96j6d"] Mar 12 15:19:05 crc kubenswrapper[4869]: I0312 15:19:05.105281 4869 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell0-43b5-account-create-update-5ggll"] Mar 12 15:19:06 crc kubenswrapper[4869]: I0312 15:19:06.361275 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cdbca9-8e71-4fd8-8389-d4476417217b" path="/var/lib/kubelet/pods/42cdbca9-8e71-4fd8-8389-d4476417217b/volumes" Mar 12 15:19:06 crc kubenswrapper[4869]: I0312 15:19:06.363383 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e396abd5-7a42-405e-827f-85f8426c6ed6" path="/var/lib/kubelet/pods/e396abd5-7a42-405e-827f-85f8426c6ed6/volumes" Mar 12 15:19:06 crc kubenswrapper[4869]: I0312 15:19:06.367181 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2" path="/var/lib/kubelet/pods/ea59d7f5-4e5d-4f4e-8c4f-310db8219fc2/volumes" Mar 12 15:19:06 crc kubenswrapper[4869]: I0312 15:19:06.368097 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9fe62a-a7c1-4c9b-8520-2046bc10995a" path="/var/lib/kubelet/pods/ec9fe62a-a7c1-4c9b-8520-2046bc10995a/volumes" Mar 12 15:19:14 crc kubenswrapper[4869]: I0312 15:19:14.336879 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:19:14 crc kubenswrapper[4869]: E0312 15:19:14.337975 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.349530 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j8qs4"] Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.351756 4869 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.364382 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j8qs4"] Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.444954 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-catalog-content\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.445049 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-utilities\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.445094 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbp2n\" (UniqueName: \"kubernetes.io/projected/f53df6ad-8f02-495f-a613-969e90d82bde-kube-api-access-sbp2n\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.546047 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-catalog-content\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.546113 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-utilities\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.546141 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbp2n\" (UniqueName: \"kubernetes.io/projected/f53df6ad-8f02-495f-a613-969e90d82bde-kube-api-access-sbp2n\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.546627 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-utilities\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.546667 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-catalog-content\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.565479 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbp2n\" (UniqueName: \"kubernetes.io/projected/f53df6ad-8f02-495f-a613-969e90d82bde-kube-api-access-sbp2n\") pod \"redhat-operators-j8qs4\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:18 crc kubenswrapper[4869]: I0312 15:19:18.674826 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:19 crc kubenswrapper[4869]: I0312 15:19:19.183520 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j8qs4"] Mar 12 15:19:19 crc kubenswrapper[4869]: I0312 15:19:19.202723 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8qs4" event={"ID":"f53df6ad-8f02-495f-a613-969e90d82bde","Type":"ContainerStarted","Data":"b0e6e92051bcc6d5ecd3a2d56cd63da60de49c16bc1eb233ac4359a449df04d8"} Mar 12 15:19:20 crc kubenswrapper[4869]: I0312 15:19:20.224197 4869 generic.go:334] "Generic (PLEG): container finished" podID="f53df6ad-8f02-495f-a613-969e90d82bde" containerID="5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0" exitCode=0 Mar 12 15:19:20 crc kubenswrapper[4869]: I0312 15:19:20.224267 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8qs4" event={"ID":"f53df6ad-8f02-495f-a613-969e90d82bde","Type":"ContainerDied","Data":"5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0"} Mar 12 15:19:21 crc kubenswrapper[4869]: I0312 15:19:21.233935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8qs4" event={"ID":"f53df6ad-8f02-495f-a613-969e90d82bde","Type":"ContainerStarted","Data":"e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488"} Mar 12 15:19:26 crc kubenswrapper[4869]: I0312 15:19:26.293293 4869 generic.go:334] "Generic (PLEG): container finished" podID="f53df6ad-8f02-495f-a613-969e90d82bde" containerID="e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488" exitCode=0 Mar 12 15:19:26 crc kubenswrapper[4869]: I0312 15:19:26.293348 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8qs4" 
event={"ID":"f53df6ad-8f02-495f-a613-969e90d82bde","Type":"ContainerDied","Data":"e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488"} Mar 12 15:19:27 crc kubenswrapper[4869]: I0312 15:19:27.305571 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8qs4" event={"ID":"f53df6ad-8f02-495f-a613-969e90d82bde","Type":"ContainerStarted","Data":"890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9"} Mar 12 15:19:27 crc kubenswrapper[4869]: I0312 15:19:27.327111 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j8qs4" podStartSLOduration=2.849656145 podStartE2EDuration="9.32709519s" podCreationTimestamp="2026-03-12 15:19:18 +0000 UTC" firstStartedPulling="2026-03-12 15:19:20.227403033 +0000 UTC m=+1912.512628341" lastFinishedPulling="2026-03-12 15:19:26.704842108 +0000 UTC m=+1918.990067386" observedRunningTime="2026-03-12 15:19:27.32667811 +0000 UTC m=+1919.611903398" watchObservedRunningTime="2026-03-12 15:19:27.32709519 +0000 UTC m=+1919.612320468" Mar 12 15:19:27 crc kubenswrapper[4869]: I0312 15:19:27.336758 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:19:27 crc kubenswrapper[4869]: E0312 15:19:27.337048 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:19:28 crc kubenswrapper[4869]: I0312 15:19:28.316391 4869 generic.go:334] "Generic (PLEG): container finished" podID="ed6cecf2-9a8d-4bd2-8f07-29410d12dba2" 
containerID="18a135aa67d3c0e072cce584f1ef3e6d5f46726a9582ad22d6ae26896df7efd5" exitCode=0 Mar 12 15:19:28 crc kubenswrapper[4869]: I0312 15:19:28.316465 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" event={"ID":"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2","Type":"ContainerDied","Data":"18a135aa67d3c0e072cce584f1ef3e6d5f46726a9582ad22d6ae26896df7efd5"} Mar 12 15:19:28 crc kubenswrapper[4869]: I0312 15:19:28.675131 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:28 crc kubenswrapper[4869]: I0312 15:19:28.675619 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.732277 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j8qs4" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="registry-server" probeResult="failure" output=< Mar 12 15:19:29 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 15:19:29 crc kubenswrapper[4869]: > Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.783640 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.869271 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf7m7\" (UniqueName: \"kubernetes.io/projected/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-kube-api-access-kf7m7\") pod \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.869591 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-ssh-key-openstack-edpm-ipam\") pod \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.869631 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-inventory\") pod \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\" (UID: \"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2\") " Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.876766 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-kube-api-access-kf7m7" (OuterVolumeSpecName: "kube-api-access-kf7m7") pod "ed6cecf2-9a8d-4bd2-8f07-29410d12dba2" (UID: "ed6cecf2-9a8d-4bd2-8f07-29410d12dba2"). InnerVolumeSpecName "kube-api-access-kf7m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.899789 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-inventory" (OuterVolumeSpecName: "inventory") pod "ed6cecf2-9a8d-4bd2-8f07-29410d12dba2" (UID: "ed6cecf2-9a8d-4bd2-8f07-29410d12dba2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.907019 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed6cecf2-9a8d-4bd2-8f07-29410d12dba2" (UID: "ed6cecf2-9a8d-4bd2-8f07-29410d12dba2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.973640 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.973716 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:29 crc kubenswrapper[4869]: I0312 15:19:29.973736 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf7m7\" (UniqueName: \"kubernetes.io/projected/ed6cecf2-9a8d-4bd2-8f07-29410d12dba2-kube-api-access-kf7m7\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.332127 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" event={"ID":"ed6cecf2-9a8d-4bd2-8f07-29410d12dba2","Type":"ContainerDied","Data":"3bdb2ff10e5f3c62b7f106a7b4460cfaf746d0fe8201b7163c539b096680a425"} Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.332421 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bdb2ff10e5f3c62b7f106a7b4460cfaf746d0fe8201b7163c539b096680a425" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 
15:19:30.332476 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.428440 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9hsm8"] Mar 12 15:19:30 crc kubenswrapper[4869]: E0312 15:19:30.428901 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6cecf2-9a8d-4bd2-8f07-29410d12dba2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.428921 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6cecf2-9a8d-4bd2-8f07-29410d12dba2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.429122 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6cecf2-9a8d-4bd2-8f07-29410d12dba2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.429781 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.431688 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.432076 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.432183 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.432207 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.441812 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9hsm8"] Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.585386 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.585483 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9pt\" (UniqueName: \"kubernetes.io/projected/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-kube-api-access-rz9pt\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.585525 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.687665 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.687741 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9pt\" (UniqueName: \"kubernetes.io/projected/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-kube-api-access-rz9pt\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.687770 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.694311 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: 
I0312 15:19:30.694983 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.707940 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9pt\" (UniqueName: \"kubernetes.io/projected/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-kube-api-access-rz9pt\") pod \"ssh-known-hosts-edpm-deployment-9hsm8\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:30 crc kubenswrapper[4869]: I0312 15:19:30.744585 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:31 crc kubenswrapper[4869]: I0312 15:19:31.271676 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9hsm8"] Mar 12 15:19:31 crc kubenswrapper[4869]: W0312 15:19:31.277132 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d28b57_1b1d_46a0_8a03_3ff1cdcfdcd3.slice/crio-043f73dbe43e92e14797da5d2a5b17cd5d4c0d859a695ca9d5aec900121229f1 WatchSource:0}: Error finding container 043f73dbe43e92e14797da5d2a5b17cd5d4c0d859a695ca9d5aec900121229f1: Status 404 returned error can't find the container with id 043f73dbe43e92e14797da5d2a5b17cd5d4c0d859a695ca9d5aec900121229f1 Mar 12 15:19:31 crc kubenswrapper[4869]: I0312 15:19:31.340824 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" event={"ID":"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3","Type":"ContainerStarted","Data":"043f73dbe43e92e14797da5d2a5b17cd5d4c0d859a695ca9d5aec900121229f1"} Mar 12 
15:19:32 crc kubenswrapper[4869]: I0312 15:19:32.351467 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" event={"ID":"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3","Type":"ContainerStarted","Data":"f8e7efc48ffef2579058db35188140b0bb7a9d20624aea17df6ac12f14e51f02"} Mar 12 15:19:32 crc kubenswrapper[4869]: I0312 15:19:32.376984 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" podStartSLOduration=1.93267883 podStartE2EDuration="2.376958611s" podCreationTimestamp="2026-03-12 15:19:30 +0000 UTC" firstStartedPulling="2026-03-12 15:19:31.280045847 +0000 UTC m=+1923.565271125" lastFinishedPulling="2026-03-12 15:19:31.724325628 +0000 UTC m=+1924.009550906" observedRunningTime="2026-03-12 15:19:32.371397788 +0000 UTC m=+1924.656623066" watchObservedRunningTime="2026-03-12 15:19:32.376958611 +0000 UTC m=+1924.662183889" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.369793 4869 scope.go:117] "RemoveContainer" containerID="b874d3c15299068fb51df91aa3c1b783a965674412046dea094021820c9f052c" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.392944 4869 scope.go:117] "RemoveContainer" containerID="de32733dca9c691a1b604fb64e3a77cd2eaec7842e4b9d2439487dbab3745bff" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.450844 4869 scope.go:117] "RemoveContainer" containerID="665d0805fd38b2e1ff328cadab61dfff64c4ea1f95dc5fe946805046dfa08a04" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.524198 4869 scope.go:117] "RemoveContainer" containerID="16eb13845162e28d419a27553235b31320bb1c2ded11d9bbf92324524bce06d0" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.543992 4869 scope.go:117] "RemoveContainer" containerID="025491aefa24c237c6480179bf7cd3a53bf1e21bea7a7e81c4244e42724177a2" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.587150 4869 scope.go:117] "RemoveContainer" 
containerID="bb8efafcd12a55d0d332b344fd4a1e8bd28d6b8f8fea0f88349c25e3572ae200" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.632920 4869 scope.go:117] "RemoveContainer" containerID="7ba235e0554dcf21230c00522e982cb5615025c5105428b490d0d2c068ffc132" Mar 12 15:19:33 crc kubenswrapper[4869]: I0312 15:19:33.650049 4869 scope.go:117] "RemoveContainer" containerID="08b04517a54d502fe370e31cd01123f53f5eb592994f193a2f2359fd30ade15f" Mar 12 15:19:38 crc kubenswrapper[4869]: I0312 15:19:38.411246 4869 generic.go:334] "Generic (PLEG): container finished" podID="e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3" containerID="f8e7efc48ffef2579058db35188140b0bb7a9d20624aea17df6ac12f14e51f02" exitCode=0 Mar 12 15:19:38 crc kubenswrapper[4869]: I0312 15:19:38.411372 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" event={"ID":"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3","Type":"ContainerDied","Data":"f8e7efc48ffef2579058db35188140b0bb7a9d20624aea17df6ac12f14e51f02"} Mar 12 15:19:38 crc kubenswrapper[4869]: I0312 15:19:38.725080 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:38 crc kubenswrapper[4869]: I0312 15:19:38.786395 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:39 crc kubenswrapper[4869]: I0312 15:19:39.905017 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:39 crc kubenswrapper[4869]: I0312 15:19:39.972663 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz9pt\" (UniqueName: \"kubernetes.io/projected/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-kube-api-access-rz9pt\") pod \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " Mar 12 15:19:39 crc kubenswrapper[4869]: I0312 15:19:39.972723 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-inventory-0\") pod \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " Mar 12 15:19:39 crc kubenswrapper[4869]: I0312 15:19:39.972854 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-ssh-key-openstack-edpm-ipam\") pod \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\" (UID: \"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3\") " Mar 12 15:19:39 crc kubenswrapper[4869]: I0312 15:19:39.981856 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-kube-api-access-rz9pt" (OuterVolumeSpecName: "kube-api-access-rz9pt") pod "e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3" (UID: "e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3"). InnerVolumeSpecName "kube-api-access-rz9pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.004955 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3" (UID: "e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.006013 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3" (UID: "e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.075295 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.075336 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz9pt\" (UniqueName: \"kubernetes.io/projected/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-kube-api-access-rz9pt\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.075346 4869 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.337571 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:19:40 crc 
kubenswrapper[4869]: E0312 15:19:40.337860 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.429910 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" event={"ID":"e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3","Type":"ContainerDied","Data":"043f73dbe43e92e14797da5d2a5b17cd5d4c0d859a695ca9d5aec900121229f1"} Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.429981 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="043f73dbe43e92e14797da5d2a5b17cd5d4c0d859a695ca9d5aec900121229f1" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.430045 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9hsm8" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.500284 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9"] Mar 12 15:19:40 crc kubenswrapper[4869]: E0312 15:19:40.500675 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3" containerName="ssh-known-hosts-edpm-deployment" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.500701 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3" containerName="ssh-known-hosts-edpm-deployment" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.500952 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3" containerName="ssh-known-hosts-edpm-deployment" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.501626 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.506605 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.507967 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.508008 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.509532 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.512294 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9"] Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.551914 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j8qs4"] Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.552178 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j8qs4" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="registry-server" containerID="cri-o://890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9" gracePeriod=2 Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.583975 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.584101 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.584205 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmgn7\" (UniqueName: \"kubernetes.io/projected/21afe139-09ed-4ebc-b81b-1be277a8e199-kube-api-access-pmgn7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.689379 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmgn7\" (UniqueName: \"kubernetes.io/projected/21afe139-09ed-4ebc-b81b-1be277a8e199-kube-api-access-pmgn7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.689487 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.689585 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.694036 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.694265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.712552 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmgn7\" (UniqueName: \"kubernetes.io/projected/21afe139-09ed-4ebc-b81b-1be277a8e199-kube-api-access-pmgn7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gg2w9\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:40 crc kubenswrapper[4869]: I0312 15:19:40.825565 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.011045 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.056185 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tgt7x"] Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.072860 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tgt7x"] Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.096620 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbp2n\" (UniqueName: \"kubernetes.io/projected/f53df6ad-8f02-495f-a613-969e90d82bde-kube-api-access-sbp2n\") pod \"f53df6ad-8f02-495f-a613-969e90d82bde\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.096783 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-utilities\") pod \"f53df6ad-8f02-495f-a613-969e90d82bde\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.096861 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-catalog-content\") pod \"f53df6ad-8f02-495f-a613-969e90d82bde\" (UID: \"f53df6ad-8f02-495f-a613-969e90d82bde\") " Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.097666 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-utilities" (OuterVolumeSpecName: "utilities") pod "f53df6ad-8f02-495f-a613-969e90d82bde" (UID: "f53df6ad-8f02-495f-a613-969e90d82bde"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.101357 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53df6ad-8f02-495f-a613-969e90d82bde-kube-api-access-sbp2n" (OuterVolumeSpecName: "kube-api-access-sbp2n") pod "f53df6ad-8f02-495f-a613-969e90d82bde" (UID: "f53df6ad-8f02-495f-a613-969e90d82bde"). InnerVolumeSpecName "kube-api-access-sbp2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.199721 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbp2n\" (UniqueName: \"kubernetes.io/projected/f53df6ad-8f02-495f-a613-969e90d82bde-kube-api-access-sbp2n\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.199751 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.264472 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f53df6ad-8f02-495f-a613-969e90d82bde" (UID: "f53df6ad-8f02-495f-a613-969e90d82bde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.301397 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f53df6ad-8f02-495f-a613-969e90d82bde-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.364393 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9"] Mar 12 15:19:41 crc kubenswrapper[4869]: W0312 15:19:41.365396 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21afe139_09ed_4ebc_b81b_1be277a8e199.slice/crio-416346a22b65f6eaadcdb7fa2c4a34188c28d22375a696b27409e3a777ad14d4 WatchSource:0}: Error finding container 416346a22b65f6eaadcdb7fa2c4a34188c28d22375a696b27409e3a777ad14d4: Status 404 returned error can't find the container with id 416346a22b65f6eaadcdb7fa2c4a34188c28d22375a696b27409e3a777ad14d4 Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.440072 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" event={"ID":"21afe139-09ed-4ebc-b81b-1be277a8e199","Type":"ContainerStarted","Data":"416346a22b65f6eaadcdb7fa2c4a34188c28d22375a696b27409e3a777ad14d4"} Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.442044 4869 generic.go:334] "Generic (PLEG): container finished" podID="f53df6ad-8f02-495f-a613-969e90d82bde" containerID="890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9" exitCode=0 Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.442091 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8qs4" event={"ID":"f53df6ad-8f02-495f-a613-969e90d82bde","Type":"ContainerDied","Data":"890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9"} Mar 12 15:19:41 crc 
kubenswrapper[4869]: I0312 15:19:41.442121 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j8qs4" event={"ID":"f53df6ad-8f02-495f-a613-969e90d82bde","Type":"ContainerDied","Data":"b0e6e92051bcc6d5ecd3a2d56cd63da60de49c16bc1eb233ac4359a449df04d8"} Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.442143 4869 scope.go:117] "RemoveContainer" containerID="890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.442147 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j8qs4" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.461396 4869 scope.go:117] "RemoveContainer" containerID="e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.478453 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j8qs4"] Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.487578 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j8qs4"] Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.505518 4869 scope.go:117] "RemoveContainer" containerID="5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.528492 4869 scope.go:117] "RemoveContainer" containerID="890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9" Mar 12 15:19:41 crc kubenswrapper[4869]: E0312 15:19:41.528877 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9\": container with ID starting with 890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9 not found: ID does not exist" 
containerID="890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.528908 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9"} err="failed to get container status \"890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9\": rpc error: code = NotFound desc = could not find container \"890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9\": container with ID starting with 890498c178b82a49a1f0cb0940fbbbeaacddd8326b6652e2719abfe1c9856ce9 not found: ID does not exist" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.528927 4869 scope.go:117] "RemoveContainer" containerID="e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488" Mar 12 15:19:41 crc kubenswrapper[4869]: E0312 15:19:41.529296 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488\": container with ID starting with e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488 not found: ID does not exist" containerID="e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.529320 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488"} err="failed to get container status \"e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488\": rpc error: code = NotFound desc = could not find container \"e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488\": container with ID starting with e6c8ad6dacd3c979e7ad740f80048a8eb1eff8f75cfaa9e47e5ff3e08f094488 not found: ID does not exist" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.529333 4869 scope.go:117] 
"RemoveContainer" containerID="5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0" Mar 12 15:19:41 crc kubenswrapper[4869]: E0312 15:19:41.529490 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0\": container with ID starting with 5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0 not found: ID does not exist" containerID="5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0" Mar 12 15:19:41 crc kubenswrapper[4869]: I0312 15:19:41.529527 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0"} err="failed to get container status \"5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0\": rpc error: code = NotFound desc = could not find container \"5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0\": container with ID starting with 5603f8b34a7d79c2103ceebb155270bc138b6b2ad5d9096a0962e3c5aceb23f0 not found: ID does not exist" Mar 12 15:19:42 crc kubenswrapper[4869]: I0312 15:19:42.348989 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c21e651-1883-4a95-a803-e6a5a2b7f457" path="/var/lib/kubelet/pods/7c21e651-1883-4a95-a803-e6a5a2b7f457/volumes" Mar 12 15:19:42 crc kubenswrapper[4869]: I0312 15:19:42.350166 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" path="/var/lib/kubelet/pods/f53df6ad-8f02-495f-a613-969e90d82bde/volumes" Mar 12 15:19:42 crc kubenswrapper[4869]: I0312 15:19:42.457299 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" event={"ID":"21afe139-09ed-4ebc-b81b-1be277a8e199","Type":"ContainerStarted","Data":"942d353ef8af41d8b853ab78cf0251ea12b1b4f0b3e1eb5d859ee6f87cd7e7e2"} Mar 12 
15:19:42 crc kubenswrapper[4869]: I0312 15:19:42.474977 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" podStartSLOduration=2.002530322 podStartE2EDuration="2.474960246s" podCreationTimestamp="2026-03-12 15:19:40 +0000 UTC" firstStartedPulling="2026-03-12 15:19:41.367348798 +0000 UTC m=+1933.652574076" lastFinishedPulling="2026-03-12 15:19:41.839778712 +0000 UTC m=+1934.125004000" observedRunningTime="2026-03-12 15:19:42.471520947 +0000 UTC m=+1934.756746235" watchObservedRunningTime="2026-03-12 15:19:42.474960246 +0000 UTC m=+1934.760185524" Mar 12 15:19:49 crc kubenswrapper[4869]: I0312 15:19:49.521274 4869 generic.go:334] "Generic (PLEG): container finished" podID="21afe139-09ed-4ebc-b81b-1be277a8e199" containerID="942d353ef8af41d8b853ab78cf0251ea12b1b4f0b3e1eb5d859ee6f87cd7e7e2" exitCode=0 Mar 12 15:19:49 crc kubenswrapper[4869]: I0312 15:19:49.521397 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" event={"ID":"21afe139-09ed-4ebc-b81b-1be277a8e199","Type":"ContainerDied","Data":"942d353ef8af41d8b853ab78cf0251ea12b1b4f0b3e1eb5d859ee6f87cd7e7e2"} Mar 12 15:19:50 crc kubenswrapper[4869]: I0312 15:19:50.960168 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.009948 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-inventory\") pod \"21afe139-09ed-4ebc-b81b-1be277a8e199\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.010294 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmgn7\" (UniqueName: \"kubernetes.io/projected/21afe139-09ed-4ebc-b81b-1be277a8e199-kube-api-access-pmgn7\") pod \"21afe139-09ed-4ebc-b81b-1be277a8e199\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.010325 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-ssh-key-openstack-edpm-ipam\") pod \"21afe139-09ed-4ebc-b81b-1be277a8e199\" (UID: \"21afe139-09ed-4ebc-b81b-1be277a8e199\") " Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.015428 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21afe139-09ed-4ebc-b81b-1be277a8e199-kube-api-access-pmgn7" (OuterVolumeSpecName: "kube-api-access-pmgn7") pod "21afe139-09ed-4ebc-b81b-1be277a8e199" (UID: "21afe139-09ed-4ebc-b81b-1be277a8e199"). InnerVolumeSpecName "kube-api-access-pmgn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.037329 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-inventory" (OuterVolumeSpecName: "inventory") pod "21afe139-09ed-4ebc-b81b-1be277a8e199" (UID: "21afe139-09ed-4ebc-b81b-1be277a8e199"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.040060 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "21afe139-09ed-4ebc-b81b-1be277a8e199" (UID: "21afe139-09ed-4ebc-b81b-1be277a8e199"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.113039 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmgn7\" (UniqueName: \"kubernetes.io/projected/21afe139-09ed-4ebc-b81b-1be277a8e199-kube-api-access-pmgn7\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.113075 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.113086 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21afe139-09ed-4ebc-b81b-1be277a8e199-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.540531 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" event={"ID":"21afe139-09ed-4ebc-b81b-1be277a8e199","Type":"ContainerDied","Data":"416346a22b65f6eaadcdb7fa2c4a34188c28d22375a696b27409e3a777ad14d4"} Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.541034 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="416346a22b65f6eaadcdb7fa2c4a34188c28d22375a696b27409e3a777ad14d4" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 
15:19:51.541111 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gg2w9" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.620869 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt"] Mar 12 15:19:51 crc kubenswrapper[4869]: E0312 15:19:51.621348 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="extract-content" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.621364 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="extract-content" Mar 12 15:19:51 crc kubenswrapper[4869]: E0312 15:19:51.621381 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="registry-server" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.621388 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="registry-server" Mar 12 15:19:51 crc kubenswrapper[4869]: E0312 15:19:51.621412 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="extract-utilities" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.621420 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="extract-utilities" Mar 12 15:19:51 crc kubenswrapper[4869]: E0312 15:19:51.621441 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21afe139-09ed-4ebc-b81b-1be277a8e199" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.621447 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="21afe139-09ed-4ebc-b81b-1be277a8e199" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 
15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.621646 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53df6ad-8f02-495f-a613-969e90d82bde" containerName="registry-server" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.621667 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="21afe139-09ed-4ebc-b81b-1be277a8e199" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.622373 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.625176 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.625277 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.625391 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.626315 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.630088 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt"] Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.728575 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.728671 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.729068 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpcsl\" (UniqueName: \"kubernetes.io/projected/90ac4b9f-1907-430a-9264-dcab4c86e118-kube-api-access-tpcsl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.831104 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.831438 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpcsl\" (UniqueName: \"kubernetes.io/projected/90ac4b9f-1907-430a-9264-dcab4c86e118-kube-api-access-tpcsl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.831655 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.835064 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.835873 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.855271 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpcsl\" (UniqueName: \"kubernetes.io/projected/90ac4b9f-1907-430a-9264-dcab4c86e118-kube-api-access-tpcsl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:51 crc kubenswrapper[4869]: I0312 15:19:51.944443 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:19:52 crc kubenswrapper[4869]: I0312 15:19:52.451086 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt"] Mar 12 15:19:52 crc kubenswrapper[4869]: I0312 15:19:52.549304 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" event={"ID":"90ac4b9f-1907-430a-9264-dcab4c86e118","Type":"ContainerStarted","Data":"55937dcf747e29e396ddb52e63cd6d00c2cfed4434d58134759ae1c4686b33fa"} Mar 12 15:19:53 crc kubenswrapper[4869]: I0312 15:19:53.337269 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:19:53 crc kubenswrapper[4869]: E0312 15:19:53.337833 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:19:53 crc kubenswrapper[4869]: I0312 15:19:53.559910 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" event={"ID":"90ac4b9f-1907-430a-9264-dcab4c86e118","Type":"ContainerStarted","Data":"2c9f70da075b5f413a1ad0c2a66a06aebccc9810d65461f08c2282d90cd11678"} Mar 12 15:19:53 crc kubenswrapper[4869]: I0312 15:19:53.584126 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" podStartSLOduration=2.179409227 podStartE2EDuration="2.584105921s" podCreationTimestamp="2026-03-12 15:19:51 +0000 UTC" firstStartedPulling="2026-03-12 
15:19:52.459058325 +0000 UTC m=+1944.744283603" lastFinishedPulling="2026-03-12 15:19:52.863755019 +0000 UTC m=+1945.148980297" observedRunningTime="2026-03-12 15:19:53.575470009 +0000 UTC m=+1945.860695297" watchObservedRunningTime="2026-03-12 15:19:53.584105921 +0000 UTC m=+1945.869331199" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.128533 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555480-bplrp"] Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.130434 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-bplrp" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.134462 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.134999 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.135142 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.140788 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-bplrp"] Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.306084 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n8nr\" (UniqueName: \"kubernetes.io/projected/a21e3145-7118-4360-90d6-9f32c92208b2-kube-api-access-2n8nr\") pod \"auto-csr-approver-29555480-bplrp\" (UID: \"a21e3145-7118-4360-90d6-9f32c92208b2\") " pod="openshift-infra/auto-csr-approver-29555480-bplrp" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.408056 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n8nr\" (UniqueName: 
\"kubernetes.io/projected/a21e3145-7118-4360-90d6-9f32c92208b2-kube-api-access-2n8nr\") pod \"auto-csr-approver-29555480-bplrp\" (UID: \"a21e3145-7118-4360-90d6-9f32c92208b2\") " pod="openshift-infra/auto-csr-approver-29555480-bplrp" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.429786 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n8nr\" (UniqueName: \"kubernetes.io/projected/a21e3145-7118-4360-90d6-9f32c92208b2-kube-api-access-2n8nr\") pod \"auto-csr-approver-29555480-bplrp\" (UID: \"a21e3145-7118-4360-90d6-9f32c92208b2\") " pod="openshift-infra/auto-csr-approver-29555480-bplrp" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.459967 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-bplrp" Mar 12 15:20:00 crc kubenswrapper[4869]: I0312 15:20:00.891824 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-bplrp"] Mar 12 15:20:01 crc kubenswrapper[4869]: I0312 15:20:01.640909 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555480-bplrp" event={"ID":"a21e3145-7118-4360-90d6-9f32c92208b2","Type":"ContainerStarted","Data":"7ccdc4e23667e0ea3518c2b692c8b059185ea4ce3820da607a13d52a435699fc"} Mar 12 15:20:01 crc kubenswrapper[4869]: I0312 15:20:01.643261 4869 generic.go:334] "Generic (PLEG): container finished" podID="90ac4b9f-1907-430a-9264-dcab4c86e118" containerID="2c9f70da075b5f413a1ad0c2a66a06aebccc9810d65461f08c2282d90cd11678" exitCode=0 Mar 12 15:20:01 crc kubenswrapper[4869]: I0312 15:20:01.643294 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" event={"ID":"90ac4b9f-1907-430a-9264-dcab4c86e118","Type":"ContainerDied","Data":"2c9f70da075b5f413a1ad0c2a66a06aebccc9810d65461f08c2282d90cd11678"} Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.120841 4869 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.275163 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpcsl\" (UniqueName: \"kubernetes.io/projected/90ac4b9f-1907-430a-9264-dcab4c86e118-kube-api-access-tpcsl\") pod \"90ac4b9f-1907-430a-9264-dcab4c86e118\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.275359 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-inventory\") pod \"90ac4b9f-1907-430a-9264-dcab4c86e118\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.275499 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-ssh-key-openstack-edpm-ipam\") pod \"90ac4b9f-1907-430a-9264-dcab4c86e118\" (UID: \"90ac4b9f-1907-430a-9264-dcab4c86e118\") " Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.285296 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90ac4b9f-1907-430a-9264-dcab4c86e118-kube-api-access-tpcsl" (OuterVolumeSpecName: "kube-api-access-tpcsl") pod "90ac4b9f-1907-430a-9264-dcab4c86e118" (UID: "90ac4b9f-1907-430a-9264-dcab4c86e118"). InnerVolumeSpecName "kube-api-access-tpcsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.299957 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "90ac4b9f-1907-430a-9264-dcab4c86e118" (UID: "90ac4b9f-1907-430a-9264-dcab4c86e118"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.304087 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-inventory" (OuterVolumeSpecName: "inventory") pod "90ac4b9f-1907-430a-9264-dcab4c86e118" (UID: "90ac4b9f-1907-430a-9264-dcab4c86e118"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.378658 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.378694 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90ac4b9f-1907-430a-9264-dcab4c86e118-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.378704 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpcsl\" (UniqueName: \"kubernetes.io/projected/90ac4b9f-1907-430a-9264-dcab4c86e118-kube-api-access-tpcsl\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.662734 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" 
event={"ID":"90ac4b9f-1907-430a-9264-dcab4c86e118","Type":"ContainerDied","Data":"55937dcf747e29e396ddb52e63cd6d00c2cfed4434d58134759ae1c4686b33fa"} Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.662763 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.662785 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55937dcf747e29e396ddb52e63cd6d00c2cfed4434d58134759ae1c4686b33fa" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.665116 4869 generic.go:334] "Generic (PLEG): container finished" podID="a21e3145-7118-4360-90d6-9f32c92208b2" containerID="5c4d37df4b83531931f84bb5a5adfe55bc370852940e13d94da0157dac4bb7b4" exitCode=0 Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.665171 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555480-bplrp" event={"ID":"a21e3145-7118-4360-90d6-9f32c92208b2","Type":"ContainerDied","Data":"5c4d37df4b83531931f84bb5a5adfe55bc370852940e13d94da0157dac4bb7b4"} Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.749108 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g"] Mar 12 15:20:03 crc kubenswrapper[4869]: E0312 15:20:03.750179 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ac4b9f-1907-430a-9264-dcab4c86e118" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.750212 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ac4b9f-1907-430a-9264-dcab4c86e118" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.750976 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ac4b9f-1907-430a-9264-dcab4c86e118" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.752493 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.755387 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.755612 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.755682 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.755973 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.756182 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.756363 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.756570 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.756752 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.775652 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g"] Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 
15:20:03.888960 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889080 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889139 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2chg\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-kube-api-access-t2chg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889184 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889216 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889255 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889309 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889358 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889390 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889468 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889498 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889525 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889686 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.889761 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.993004 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.993078 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.993769 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.993866 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.993890 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.993920 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.993948 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-neutron-metadata-default-certs-0\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.994009 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.994076 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.994115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.994157 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2chg\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-kube-api-access-t2chg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: 
\"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.994188 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.994222 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.994256 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.998236 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 
15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.998592 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.998838 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.998846 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:03 crc kubenswrapper[4869]: I0312 15:20:03.999615 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.000440 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.000555 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.000717 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.000753 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.001331 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.001783 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.002740 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.010284 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2chg\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-kube-api-access-t2chg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.018723 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g\" 
(UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.081436 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.642090 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g"] Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.677859 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" event={"ID":"29b19fb6-d0a2-4944-972b-0dc4d285fd67","Type":"ContainerStarted","Data":"386c2279bb32756973c567dc1a069d8896b876ee07520109bc49b1000942816d"} Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.903251 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-bplrp" Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.921129 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n8nr\" (UniqueName: \"kubernetes.io/projected/a21e3145-7118-4360-90d6-9f32c92208b2-kube-api-access-2n8nr\") pod \"a21e3145-7118-4360-90d6-9f32c92208b2\" (UID: \"a21e3145-7118-4360-90d6-9f32c92208b2\") " Mar 12 15:20:04 crc kubenswrapper[4869]: I0312 15:20:04.926621 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21e3145-7118-4360-90d6-9f32c92208b2-kube-api-access-2n8nr" (OuterVolumeSpecName: "kube-api-access-2n8nr") pod "a21e3145-7118-4360-90d6-9f32c92208b2" (UID: "a21e3145-7118-4360-90d6-9f32c92208b2"). InnerVolumeSpecName "kube-api-access-2n8nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.024494 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n8nr\" (UniqueName: \"kubernetes.io/projected/a21e3145-7118-4360-90d6-9f32c92208b2-kube-api-access-2n8nr\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.069691 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bzm2j"] Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.077612 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdqpj"] Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.086025 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bzm2j"] Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.095091 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wdqpj"] Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.336990 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:20:05 crc kubenswrapper[4869]: E0312 15:20:05.337222 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.693949 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555480-bplrp" 
event={"ID":"a21e3145-7118-4360-90d6-9f32c92208b2","Type":"ContainerDied","Data":"7ccdc4e23667e0ea3518c2b692c8b059185ea4ce3820da607a13d52a435699fc"} Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.693996 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ccdc4e23667e0ea3518c2b692c8b059185ea4ce3820da607a13d52a435699fc" Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.694065 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-bplrp" Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.708570 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" event={"ID":"29b19fb6-d0a2-4944-972b-0dc4d285fd67","Type":"ContainerStarted","Data":"4236e466e564fb429746a87ab32d6d1c475f3b1198877db70f85e2589438350b"} Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.741683 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" podStartSLOduration=2.3547003650000002 podStartE2EDuration="2.741669253s" podCreationTimestamp="2026-03-12 15:20:03 +0000 UTC" firstStartedPulling="2026-03-12 15:20:04.644532295 +0000 UTC m=+1956.929757573" lastFinishedPulling="2026-03-12 15:20:05.031501183 +0000 UTC m=+1957.316726461" observedRunningTime="2026-03-12 15:20:05.727481059 +0000 UTC m=+1958.012706357" watchObservedRunningTime="2026-03-12 15:20:05.741669253 +0000 UTC m=+1958.026894531" Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.982483 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-6lm5p"] Mar 12 15:20:05 crc kubenswrapper[4869]: I0312 15:20:05.991996 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-6lm5p"] Mar 12 15:20:06 crc kubenswrapper[4869]: I0312 15:20:06.348451 4869 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2829ee85-1daa-4418-b99c-90bbebceb7c2" path="/var/lib/kubelet/pods/2829ee85-1daa-4418-b99c-90bbebceb7c2/volumes" Mar 12 15:20:06 crc kubenswrapper[4869]: I0312 15:20:06.349057 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae66448-0cc3-4feb-b232-387110fba260" path="/var/lib/kubelet/pods/dae66448-0cc3-4feb-b232-387110fba260/volumes" Mar 12 15:20:06 crc kubenswrapper[4869]: I0312 15:20:06.349740 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f547181b-aa83-44ec-9c1a-dfbc394ff6a5" path="/var/lib/kubelet/pods/f547181b-aa83-44ec-9c1a-dfbc394ff6a5/volumes" Mar 12 15:20:18 crc kubenswrapper[4869]: I0312 15:20:18.343988 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:20:18 crc kubenswrapper[4869]: E0312 15:20:18.344873 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:20:31 crc kubenswrapper[4869]: I0312 15:20:31.336264 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:20:31 crc kubenswrapper[4869]: E0312 15:20:31.337033 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:20:33 crc kubenswrapper[4869]: I0312 15:20:33.823284 4869 scope.go:117] "RemoveContainer" containerID="59b4471156ad7b9f1040a028511e1b2918ce761e3e6f4b926fc37d1b83e33231" Mar 12 15:20:33 crc kubenswrapper[4869]: I0312 15:20:33.864337 4869 scope.go:117] "RemoveContainer" containerID="d3242ae1f0643ab17059ec452626a3afde6a26864cee76aec8d75d9e20fb05f9" Mar 12 15:20:33 crc kubenswrapper[4869]: I0312 15:20:33.917469 4869 scope.go:117] "RemoveContainer" containerID="fb8feff685efe1b60d51bd5cf1b516b74dd7980d841d58162ce0d145a168e576" Mar 12 15:20:33 crc kubenswrapper[4869]: I0312 15:20:33.974048 4869 scope.go:117] "RemoveContainer" containerID="4fc0e8e6db5dd590434b9022f05c0318a9911d651083687ec672665e85926ce1" Mar 12 15:20:38 crc kubenswrapper[4869]: I0312 15:20:38.038837 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4zpf4"] Mar 12 15:20:38 crc kubenswrapper[4869]: I0312 15:20:38.050674 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4zpf4"] Mar 12 15:20:38 crc kubenswrapper[4869]: I0312 15:20:38.345951 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b423daaf-54ec-4a18-a9b6-572c4f32a207" path="/var/lib/kubelet/pods/b423daaf-54ec-4a18-a9b6-572c4f32a207/volumes" Mar 12 15:20:41 crc kubenswrapper[4869]: I0312 15:20:41.010001 4869 generic.go:334] "Generic (PLEG): container finished" podID="29b19fb6-d0a2-4944-972b-0dc4d285fd67" containerID="4236e466e564fb429746a87ab32d6d1c475f3b1198877db70f85e2589438350b" exitCode=0 Mar 12 15:20:41 crc kubenswrapper[4869]: I0312 15:20:41.010040 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" event={"ID":"29b19fb6-d0a2-4944-972b-0dc4d285fd67","Type":"ContainerDied","Data":"4236e466e564fb429746a87ab32d6d1c475f3b1198877db70f85e2589438350b"} Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.487850 
4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.654305 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.654429 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-repo-setup-combined-ca-bundle\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.655770 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-libvirt-combined-ca-bundle\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.655944 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-neutron-metadata-combined-ca-bundle\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656099 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-inventory\") pod 
\"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656156 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ssh-key-openstack-edpm-ipam\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656185 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656266 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656319 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-nova-combined-ca-bundle\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656385 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2chg\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-kube-api-access-t2chg\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: 
\"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656421 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ovn-combined-ca-bundle\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656513 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-bootstrap-combined-ca-bundle\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.656972 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-ovn-default-certs-0\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.657049 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-telemetry-combined-ca-bundle\") pod \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\" (UID: \"29b19fb6-d0a2-4944-972b-0dc4d285fd67\") " Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.662790 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.663038 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.663105 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.663237 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.663781 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.663991 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.665116 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.665300 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.665839 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.667491 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.674267 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.691936 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-kube-api-access-t2chg" (OuterVolumeSpecName: "kube-api-access-t2chg") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "kube-api-access-t2chg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.704347 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-inventory" (OuterVolumeSpecName: "inventory") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.708321 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29b19fb6-d0a2-4944-972b-0dc4d285fd67" (UID: "29b19fb6-d0a2-4944-972b-0dc4d285fd67"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759795 4869 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759836 4869 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759850 4869 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759860 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759869 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759878 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759887 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759897 4869 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759905 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2chg\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-kube-api-access-t2chg\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759915 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759923 4869 reconciler_common.go:293] 
"Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759933 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759941 4869 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29b19fb6-d0a2-4944-972b-0dc4d285fd67-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:42 crc kubenswrapper[4869]: I0312 15:20:42.759950 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/29b19fb6-d0a2-4944-972b-0dc4d285fd67-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.025601 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" event={"ID":"29b19fb6-d0a2-4944-972b-0dc4d285fd67","Type":"ContainerDied","Data":"386c2279bb32756973c567dc1a069d8896b876ee07520109bc49b1000942816d"} Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.025951 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386c2279bb32756973c567dc1a069d8896b876ee07520109bc49b1000942816d" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.025634 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.229109 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv"] Mar 12 15:20:43 crc kubenswrapper[4869]: E0312 15:20:43.229570 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b19fb6-d0a2-4944-972b-0dc4d285fd67" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.229594 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b19fb6-d0a2-4944-972b-0dc4d285fd67" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 12 15:20:43 crc kubenswrapper[4869]: E0312 15:20:43.229616 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21e3145-7118-4360-90d6-9f32c92208b2" containerName="oc" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.229624 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21e3145-7118-4360-90d6-9f32c92208b2" containerName="oc" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.229852 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21e3145-7118-4360-90d6-9f32c92208b2" containerName="oc" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.229879 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b19fb6-d0a2-4944-972b-0dc4d285fd67" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.230496 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.233616 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.233654 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.233616 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.233981 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.237952 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.244945 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv"] Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.371389 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.371449 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.371614 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.371649 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/f9846b6b-2f29-45b7-86c0-aff84144a93a-kube-api-access-2hsrm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.371795 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.473424 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.473495 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/f9846b6b-2f29-45b7-86c0-aff84144a93a-kube-api-access-2hsrm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.473555 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.474371 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.474375 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.474472 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: 
\"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.484050 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.484050 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.484674 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.493504 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/f9846b6b-2f29-45b7-86c0-aff84144a93a-kube-api-access-2hsrm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-d9ljv\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:43 crc kubenswrapper[4869]: I0312 15:20:43.563089 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:20:44 crc kubenswrapper[4869]: I0312 15:20:44.125483 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv"] Mar 12 15:20:45 crc kubenswrapper[4869]: I0312 15:20:45.042051 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" event={"ID":"f9846b6b-2f29-45b7-86c0-aff84144a93a","Type":"ContainerStarted","Data":"6edd6843612e32419d5da3fbbefcad25f7e07bbe7a8359e220645e28fee8b7ad"} Mar 12 15:20:45 crc kubenswrapper[4869]: I0312 15:20:45.042104 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" event={"ID":"f9846b6b-2f29-45b7-86c0-aff84144a93a","Type":"ContainerStarted","Data":"c8ce8d33602a7f8cfcdd141157a0a111876cc21efb958b7b367f83927228655f"} Mar 12 15:20:45 crc kubenswrapper[4869]: I0312 15:20:45.067671 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" podStartSLOduration=1.5758046129999999 podStartE2EDuration="2.067650724s" podCreationTimestamp="2026-03-12 15:20:43 +0000 UTC" firstStartedPulling="2026-03-12 15:20:44.127209501 +0000 UTC m=+1996.412434779" lastFinishedPulling="2026-03-12 15:20:44.619055622 +0000 UTC m=+1996.904280890" observedRunningTime="2026-03-12 15:20:45.062663236 +0000 UTC m=+1997.347888534" watchObservedRunningTime="2026-03-12 15:20:45.067650724 +0000 UTC m=+1997.352876012" Mar 12 15:20:45 crc kubenswrapper[4869]: I0312 15:20:45.336388 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:20:45 crc kubenswrapper[4869]: E0312 15:20:45.336961 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.170586 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7svp2"] Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.176152 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.187289 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7svp2"] Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.262178 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-utilities\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.262270 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-catalog-content\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.262345 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6cx\" (UniqueName: \"kubernetes.io/projected/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-kube-api-access-tl6cx\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " 
pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.364757 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-catalog-content\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.365110 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6cx\" (UniqueName: \"kubernetes.io/projected/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-kube-api-access-tl6cx\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.365290 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-utilities\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.365298 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-catalog-content\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.365500 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-utilities\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " 
pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.388423 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6cx\" (UniqueName: \"kubernetes.io/projected/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-kube-api-access-tl6cx\") pod \"certified-operators-7svp2\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:57 crc kubenswrapper[4869]: I0312 15:20:57.507103 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:20:58 crc kubenswrapper[4869]: I0312 15:20:58.054472 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7svp2"] Mar 12 15:20:58 crc kubenswrapper[4869]: I0312 15:20:58.143146 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7svp2" event={"ID":"e2339bbe-49ef-4cd1-806a-4c839fefdcd8","Type":"ContainerStarted","Data":"99843383b9b8fe02e55424c1e665473056dc8bbe6d6f72c9095886a8366cc44e"} Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.152503 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerID="a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31" exitCode=0 Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.152570 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7svp2" event={"ID":"e2339bbe-49ef-4cd1-806a-4c839fefdcd8","Type":"ContainerDied","Data":"a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31"} Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.568924 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xf8vz"] Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.570979 4869 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.581833 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf8vz"] Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.608003 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-utilities\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.608230 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dr4h\" (UniqueName: \"kubernetes.io/projected/4d668230-fd7a-44cf-9355-a8174bd0b4ff-kube-api-access-8dr4h\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.609029 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-catalog-content\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.711099 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-catalog-content\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.711217 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-utilities\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.711251 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dr4h\" (UniqueName: \"kubernetes.io/projected/4d668230-fd7a-44cf-9355-a8174bd0b4ff-kube-api-access-8dr4h\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.711921 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-catalog-content\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.712564 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-utilities\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.746488 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dr4h\" (UniqueName: \"kubernetes.io/projected/4d668230-fd7a-44cf-9355-a8174bd0b4ff-kube-api-access-8dr4h\") pod \"redhat-marketplace-xf8vz\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:20:59 crc kubenswrapper[4869]: I0312 15:20:59.902132 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:21:00 crc kubenswrapper[4869]: I0312 15:21:00.164525 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7svp2" event={"ID":"e2339bbe-49ef-4cd1-806a-4c839fefdcd8","Type":"ContainerStarted","Data":"8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23"} Mar 12 15:21:00 crc kubenswrapper[4869]: I0312 15:21:00.336362 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:21:00 crc kubenswrapper[4869]: I0312 15:21:00.405113 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf8vz"] Mar 12 15:21:01 crc kubenswrapper[4869]: I0312 15:21:01.177004 4869 generic.go:334] "Generic (PLEG): container finished" podID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerID="d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280" exitCode=0 Mar 12 15:21:01 crc kubenswrapper[4869]: I0312 15:21:01.177107 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf8vz" event={"ID":"4d668230-fd7a-44cf-9355-a8174bd0b4ff","Type":"ContainerDied","Data":"d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280"} Mar 12 15:21:01 crc kubenswrapper[4869]: I0312 15:21:01.177504 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf8vz" event={"ID":"4d668230-fd7a-44cf-9355-a8174bd0b4ff","Type":"ContainerStarted","Data":"b56e7551e7b826378efcca96824fb23a99ef0c6c23f6e1b8c444ca7757997f15"} Mar 12 15:21:01 crc kubenswrapper[4869]: I0312 15:21:01.182199 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"6d2f6b1edc93a1efc0577095741ab3b3ad70babba8d5975e923b0ee75f9c99a6"} Mar 12 
15:21:01 crc kubenswrapper[4869]: I0312 15:21:01.185114 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerID="8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23" exitCode=0 Mar 12 15:21:01 crc kubenswrapper[4869]: I0312 15:21:01.185146 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7svp2" event={"ID":"e2339bbe-49ef-4cd1-806a-4c839fefdcd8","Type":"ContainerDied","Data":"8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23"} Mar 12 15:21:02 crc kubenswrapper[4869]: I0312 15:21:02.201495 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7svp2" event={"ID":"e2339bbe-49ef-4cd1-806a-4c839fefdcd8","Type":"ContainerStarted","Data":"9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5"} Mar 12 15:21:02 crc kubenswrapper[4869]: I0312 15:21:02.230173 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7svp2" podStartSLOduration=2.5772184339999997 podStartE2EDuration="5.230132652s" podCreationTimestamp="2026-03-12 15:20:57 +0000 UTC" firstStartedPulling="2026-03-12 15:20:59.154835987 +0000 UTC m=+2011.440061265" lastFinishedPulling="2026-03-12 15:21:01.807750205 +0000 UTC m=+2014.092975483" observedRunningTime="2026-03-12 15:21:02.219780127 +0000 UTC m=+2014.505005405" watchObservedRunningTime="2026-03-12 15:21:02.230132652 +0000 UTC m=+2014.515357930" Mar 12 15:21:03 crc kubenswrapper[4869]: I0312 15:21:03.215417 4869 generic.go:334] "Generic (PLEG): container finished" podID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerID="8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90" exitCode=0 Mar 12 15:21:03 crc kubenswrapper[4869]: I0312 15:21:03.215491 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf8vz" 
event={"ID":"4d668230-fd7a-44cf-9355-a8174bd0b4ff","Type":"ContainerDied","Data":"8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90"} Mar 12 15:21:04 crc kubenswrapper[4869]: I0312 15:21:04.228182 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf8vz" event={"ID":"4d668230-fd7a-44cf-9355-a8174bd0b4ff","Type":"ContainerStarted","Data":"e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3"} Mar 12 15:21:04 crc kubenswrapper[4869]: I0312 15:21:04.255591 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xf8vz" podStartSLOduration=2.786959001 podStartE2EDuration="5.255570144s" podCreationTimestamp="2026-03-12 15:20:59 +0000 UTC" firstStartedPulling="2026-03-12 15:21:01.181342726 +0000 UTC m=+2013.466568004" lastFinishedPulling="2026-03-12 15:21:03.649953869 +0000 UTC m=+2015.935179147" observedRunningTime="2026-03-12 15:21:04.245797303 +0000 UTC m=+2016.531022581" watchObservedRunningTime="2026-03-12 15:21:04.255570144 +0000 UTC m=+2016.540795422" Mar 12 15:21:07 crc kubenswrapper[4869]: I0312 15:21:07.507824 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:21:07 crc kubenswrapper[4869]: I0312 15:21:07.508153 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:21:07 crc kubenswrapper[4869]: I0312 15:21:07.549859 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:21:08 crc kubenswrapper[4869]: I0312 15:21:08.320361 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:21:08 crc kubenswrapper[4869]: I0312 15:21:08.947027 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-7svp2"] Mar 12 15:21:09 crc kubenswrapper[4869]: I0312 15:21:09.902299 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:21:09 crc kubenswrapper[4869]: I0312 15:21:09.902660 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:21:09 crc kubenswrapper[4869]: I0312 15:21:09.946715 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.276655 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7svp2" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="registry-server" containerID="cri-o://9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5" gracePeriod=2 Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.348944 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.725382 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.743252 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-catalog-content\") pod \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.743363 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl6cx\" (UniqueName: \"kubernetes.io/projected/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-kube-api-access-tl6cx\") pod \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.743429 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-utilities\") pod \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\" (UID: \"e2339bbe-49ef-4cd1-806a-4c839fefdcd8\") " Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.744263 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-utilities" (OuterVolumeSpecName: "utilities") pod "e2339bbe-49ef-4cd1-806a-4c839fefdcd8" (UID: "e2339bbe-49ef-4cd1-806a-4c839fefdcd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.754351 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-kube-api-access-tl6cx" (OuterVolumeSpecName: "kube-api-access-tl6cx") pod "e2339bbe-49ef-4cd1-806a-4c839fefdcd8" (UID: "e2339bbe-49ef-4cd1-806a-4c839fefdcd8"). InnerVolumeSpecName "kube-api-access-tl6cx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.805521 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2339bbe-49ef-4cd1-806a-4c839fefdcd8" (UID: "e2339bbe-49ef-4cd1-806a-4c839fefdcd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.846106 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.846146 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl6cx\" (UniqueName: \"kubernetes.io/projected/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-kube-api-access-tl6cx\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:10 crc kubenswrapper[4869]: I0312 15:21:10.846161 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2339bbe-49ef-4cd1-806a-4c839fefdcd8-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.288043 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerID="9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5" exitCode=0 Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.288108 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7svp2" event={"ID":"e2339bbe-49ef-4cd1-806a-4c839fefdcd8","Type":"ContainerDied","Data":"9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5"} Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.288133 4869 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7svp2" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.288194 4869 scope.go:117] "RemoveContainer" containerID="9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.288176 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7svp2" event={"ID":"e2339bbe-49ef-4cd1-806a-4c839fefdcd8","Type":"ContainerDied","Data":"99843383b9b8fe02e55424c1e665473056dc8bbe6d6f72c9095886a8366cc44e"} Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.321745 4869 scope.go:117] "RemoveContainer" containerID="8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.324656 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7svp2"] Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.339286 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7svp2"] Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.346508 4869 scope.go:117] "RemoveContainer" containerID="a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.399429 4869 scope.go:117] "RemoveContainer" containerID="9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5" Mar 12 15:21:11 crc kubenswrapper[4869]: E0312 15:21:11.399978 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5\": container with ID starting with 9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5 not found: ID does not exist" containerID="9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.400033 
4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5"} err="failed to get container status \"9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5\": rpc error: code = NotFound desc = could not find container \"9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5\": container with ID starting with 9b82ea9c0cf2a358175e8554613280eb0c6ef565dd193b5a7260553dd21d77b5 not found: ID does not exist" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.400069 4869 scope.go:117] "RemoveContainer" containerID="8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23" Mar 12 15:21:11 crc kubenswrapper[4869]: E0312 15:21:11.400516 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23\": container with ID starting with 8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23 not found: ID does not exist" containerID="8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.400601 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23"} err="failed to get container status \"8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23\": rpc error: code = NotFound desc = could not find container \"8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23\": container with ID starting with 8076743abf3ed50dc19a97baaf6c7f585f2185efe9f19b5612b09cdc630fac23 not found: ID does not exist" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.400642 4869 scope.go:117] "RemoveContainer" containerID="a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31" Mar 12 15:21:11 crc kubenswrapper[4869]: E0312 
15:21:11.400968 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31\": container with ID starting with a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31 not found: ID does not exist" containerID="a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.401000 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31"} err="failed to get container status \"a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31\": rpc error: code = NotFound desc = could not find container \"a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31\": container with ID starting with a630dda13fca00c554c07b832cfb7ffce623aed15a538c435a2654ebb6f89d31 not found: ID does not exist" Mar 12 15:21:11 crc kubenswrapper[4869]: I0312 15:21:11.741596 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf8vz"] Mar 12 15:21:12 crc kubenswrapper[4869]: I0312 15:21:12.377405 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" path="/var/lib/kubelet/pods/e2339bbe-49ef-4cd1-806a-4c839fefdcd8/volumes" Mar 12 15:21:13 crc kubenswrapper[4869]: I0312 15:21:13.306574 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xf8vz" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="registry-server" containerID="cri-o://e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3" gracePeriod=2 Mar 12 15:21:13 crc kubenswrapper[4869]: I0312 15:21:13.851526 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.011749 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-catalog-content\") pod \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.012041 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-utilities\") pod \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.012156 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dr4h\" (UniqueName: \"kubernetes.io/projected/4d668230-fd7a-44cf-9355-a8174bd0b4ff-kube-api-access-8dr4h\") pod \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\" (UID: \"4d668230-fd7a-44cf-9355-a8174bd0b4ff\") " Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.012908 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-utilities" (OuterVolumeSpecName: "utilities") pod "4d668230-fd7a-44cf-9355-a8174bd0b4ff" (UID: "4d668230-fd7a-44cf-9355-a8174bd0b4ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.018814 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d668230-fd7a-44cf-9355-a8174bd0b4ff-kube-api-access-8dr4h" (OuterVolumeSpecName: "kube-api-access-8dr4h") pod "4d668230-fd7a-44cf-9355-a8174bd0b4ff" (UID: "4d668230-fd7a-44cf-9355-a8174bd0b4ff"). InnerVolumeSpecName "kube-api-access-8dr4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.035485 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d668230-fd7a-44cf-9355-a8174bd0b4ff" (UID: "4d668230-fd7a-44cf-9355-a8174bd0b4ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.114647 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.114679 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d668230-fd7a-44cf-9355-a8174bd0b4ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.114694 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dr4h\" (UniqueName: \"kubernetes.io/projected/4d668230-fd7a-44cf-9355-a8174bd0b4ff-kube-api-access-8dr4h\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.315794 4869 generic.go:334] "Generic (PLEG): container finished" podID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerID="e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3" exitCode=0 Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.315837 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf8vz" event={"ID":"4d668230-fd7a-44cf-9355-a8174bd0b4ff","Type":"ContainerDied","Data":"e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3"} Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.315862 4869 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf8vz" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.315882 4869 scope.go:117] "RemoveContainer" containerID="e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.315870 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf8vz" event={"ID":"4d668230-fd7a-44cf-9355-a8174bd0b4ff","Type":"ContainerDied","Data":"b56e7551e7b826378efcca96824fb23a99ef0c6c23f6e1b8c444ca7757997f15"} Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.348379 4869 scope.go:117] "RemoveContainer" containerID="8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.356950 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf8vz"] Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.364840 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf8vz"] Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.368111 4869 scope.go:117] "RemoveContainer" containerID="d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.412552 4869 scope.go:117] "RemoveContainer" containerID="e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3" Mar 12 15:21:14 crc kubenswrapper[4869]: E0312 15:21:14.413059 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3\": container with ID starting with e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3 not found: ID does not exist" containerID="e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.413091 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3"} err="failed to get container status \"e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3\": rpc error: code = NotFound desc = could not find container \"e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3\": container with ID starting with e9c35be318e5dc28c7cb9da9169b64fbc31a90b5415e6a01f7a726e644dc64e3 not found: ID does not exist" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.413110 4869 scope.go:117] "RemoveContainer" containerID="8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90" Mar 12 15:21:14 crc kubenswrapper[4869]: E0312 15:21:14.413407 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90\": container with ID starting with 8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90 not found: ID does not exist" containerID="8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.413427 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90"} err="failed to get container status \"8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90\": rpc error: code = NotFound desc = could not find container \"8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90\": container with ID starting with 8bc40bed86a894eb5ce4d7ca2bdb59e601b47e36c255c31824cb739d46f79f90 not found: ID does not exist" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.413438 4869 scope.go:117] "RemoveContainer" containerID="d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280" Mar 12 15:21:14 crc kubenswrapper[4869]: E0312 
15:21:14.413762 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280\": container with ID starting with d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280 not found: ID does not exist" containerID="d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280" Mar 12 15:21:14 crc kubenswrapper[4869]: I0312 15:21:14.413801 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280"} err="failed to get container status \"d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280\": rpc error: code = NotFound desc = could not find container \"d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280\": container with ID starting with d2d86646ad105a23e40865eaa60190a7cd90bf35a56736b5f09d8b21b6b48280 not found: ID does not exist" Mar 12 15:21:16 crc kubenswrapper[4869]: I0312 15:21:16.376747 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" path="/var/lib/kubelet/pods/4d668230-fd7a-44cf-9355-a8174bd0b4ff/volumes" Mar 12 15:21:34 crc kubenswrapper[4869]: I0312 15:21:34.092246 4869 scope.go:117] "RemoveContainer" containerID="4daa8807894435a3e07a29aced07bd516c2036a676927d13c83627681b7ae737" Mar 12 15:21:45 crc kubenswrapper[4869]: I0312 15:21:45.756876 4869 generic.go:334] "Generic (PLEG): container finished" podID="f9846b6b-2f29-45b7-86c0-aff84144a93a" containerID="6edd6843612e32419d5da3fbbefcad25f7e07bbe7a8359e220645e28fee8b7ad" exitCode=0 Mar 12 15:21:45 crc kubenswrapper[4869]: I0312 15:21:45.756935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" 
event={"ID":"f9846b6b-2f29-45b7-86c0-aff84144a93a","Type":"ContainerDied","Data":"6edd6843612e32419d5da3fbbefcad25f7e07bbe7a8359e220645e28fee8b7ad"} Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.214645 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.305430 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovncontroller-config-0\") pod \"f9846b6b-2f29-45b7-86c0-aff84144a93a\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.305916 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-inventory\") pod \"f9846b6b-2f29-45b7-86c0-aff84144a93a\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.306047 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ssh-key-openstack-edpm-ipam\") pod \"f9846b6b-2f29-45b7-86c0-aff84144a93a\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.306074 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovn-combined-ca-bundle\") pod \"f9846b6b-2f29-45b7-86c0-aff84144a93a\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.306099 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hsrm\" (UniqueName: 
\"kubernetes.io/projected/f9846b6b-2f29-45b7-86c0-aff84144a93a-kube-api-access-2hsrm\") pod \"f9846b6b-2f29-45b7-86c0-aff84144a93a\" (UID: \"f9846b6b-2f29-45b7-86c0-aff84144a93a\") " Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.311452 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f9846b6b-2f29-45b7-86c0-aff84144a93a" (UID: "f9846b6b-2f29-45b7-86c0-aff84144a93a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.314898 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9846b6b-2f29-45b7-86c0-aff84144a93a-kube-api-access-2hsrm" (OuterVolumeSpecName: "kube-api-access-2hsrm") pod "f9846b6b-2f29-45b7-86c0-aff84144a93a" (UID: "f9846b6b-2f29-45b7-86c0-aff84144a93a"). InnerVolumeSpecName "kube-api-access-2hsrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.331252 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f9846b6b-2f29-45b7-86c0-aff84144a93a" (UID: "f9846b6b-2f29-45b7-86c0-aff84144a93a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.332224 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-inventory" (OuterVolumeSpecName: "inventory") pod "f9846b6b-2f29-45b7-86c0-aff84144a93a" (UID: "f9846b6b-2f29-45b7-86c0-aff84144a93a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.332719 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f9846b6b-2f29-45b7-86c0-aff84144a93a" (UID: "f9846b6b-2f29-45b7-86c0-aff84144a93a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.408189 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.408217 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.408229 4869 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.408238 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/f9846b6b-2f29-45b7-86c0-aff84144a93a-kube-api-access-2hsrm\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.408248 4869 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f9846b6b-2f29-45b7-86c0-aff84144a93a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.780453 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" event={"ID":"f9846b6b-2f29-45b7-86c0-aff84144a93a","Type":"ContainerDied","Data":"c8ce8d33602a7f8cfcdd141157a0a111876cc21efb958b7b367f83927228655f"} Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.780492 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8ce8d33602a7f8cfcdd141157a0a111876cc21efb958b7b367f83927228655f" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.780523 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-d9ljv" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863013 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts"] Mar 12 15:21:47 crc kubenswrapper[4869]: E0312 15:21:47.863397 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9846b6b-2f29-45b7-86c0-aff84144a93a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863415 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9846b6b-2f29-45b7-86c0-aff84144a93a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 12 15:21:47 crc kubenswrapper[4869]: E0312 15:21:47.863431 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="extract-utilities" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863438 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="extract-utilities" Mar 12 15:21:47 crc kubenswrapper[4869]: E0312 15:21:47.863452 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="registry-server" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863458 4869 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="registry-server" Mar 12 15:21:47 crc kubenswrapper[4869]: E0312 15:21:47.863469 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="registry-server" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863475 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="registry-server" Mar 12 15:21:47 crc kubenswrapper[4869]: E0312 15:21:47.863488 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="extract-utilities" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863494 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="extract-utilities" Mar 12 15:21:47 crc kubenswrapper[4869]: E0312 15:21:47.863512 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="extract-content" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863517 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="extract-content" Mar 12 15:21:47 crc kubenswrapper[4869]: E0312 15:21:47.863524 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="extract-content" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863530 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="extract-content" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863708 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2339bbe-49ef-4cd1-806a-4c839fefdcd8" containerName="registry-server" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863727 4869 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f9846b6b-2f29-45b7-86c0-aff84144a93a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.863741 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d668230-fd7a-44cf-9355-a8174bd0b4ff" containerName="registry-server" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.864329 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.866931 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.868768 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.871192 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.871289 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.875637 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts"] Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.876251 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:21:47 crc kubenswrapper[4869]: I0312 15:21:47.876748 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.019060 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.019126 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwd5b\" (UniqueName: \"kubernetes.io/projected/42c001af-bc5e-4906-b65f-0ec328893bce-kube-api-access-bwd5b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.019190 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.020223 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.020476 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.020794 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.122864 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.123156 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.123218 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.123291 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwd5b\" (UniqueName: \"kubernetes.io/projected/42c001af-bc5e-4906-b65f-0ec328893bce-kube-api-access-bwd5b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.123370 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.123412 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.128568 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.129573 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.129859 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.131053 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.131798 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.148958 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwd5b\" (UniqueName: \"kubernetes.io/projected/42c001af-bc5e-4906-b65f-0ec328893bce-kube-api-access-bwd5b\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.185409 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.745255 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts"] Mar 12 15:21:48 crc kubenswrapper[4869]: W0312 15:21:48.748165 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c001af_bc5e_4906_b65f_0ec328893bce.slice/crio-89c85f427a0abcbaa0ca8e99d54f2af7d3c55a7c314607f6f598006be0a13aae WatchSource:0}: Error finding container 89c85f427a0abcbaa0ca8e99d54f2af7d3c55a7c314607f6f598006be0a13aae: Status 404 returned error can't find the container with id 89c85f427a0abcbaa0ca8e99d54f2af7d3c55a7c314607f6f598006be0a13aae Mar 12 15:21:48 crc kubenswrapper[4869]: I0312 15:21:48.789181 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" event={"ID":"42c001af-bc5e-4906-b65f-0ec328893bce","Type":"ContainerStarted","Data":"89c85f427a0abcbaa0ca8e99d54f2af7d3c55a7c314607f6f598006be0a13aae"} Mar 12 15:21:49 crc kubenswrapper[4869]: I0312 15:21:49.800519 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" event={"ID":"42c001af-bc5e-4906-b65f-0ec328893bce","Type":"ContainerStarted","Data":"1c9fe49d0b81729ef634d9ef1bc998c671af2a36edafe7040c7f1cea1bd87bea"} Mar 12 15:21:49 crc kubenswrapper[4869]: I0312 15:21:49.825288 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" podStartSLOduration=2.349743975 podStartE2EDuration="2.825271117s" podCreationTimestamp="2026-03-12 15:21:47 +0000 UTC" firstStartedPulling="2026-03-12 15:21:48.751189502 +0000 UTC m=+2061.036414780" lastFinishedPulling="2026-03-12 15:21:49.226716604 +0000 UTC m=+2061.511941922" observedRunningTime="2026-03-12 15:21:49.822295851 +0000 UTC m=+2062.107521139" watchObservedRunningTime="2026-03-12 15:21:49.825271117 +0000 UTC m=+2062.110496395" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.145208 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555482-sjpgl"] Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.148062 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-sjpgl" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.150105 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.152948 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.153357 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.157296 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-sjpgl"] Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.217094 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5pt\" (UniqueName: \"kubernetes.io/projected/f76d71a8-ef71-49d2-815f-c043aa32ab16-kube-api-access-zh5pt\") pod \"auto-csr-approver-29555482-sjpgl\" (UID: \"f76d71a8-ef71-49d2-815f-c043aa32ab16\") " pod="openshift-infra/auto-csr-approver-29555482-sjpgl" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.318978 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5pt\" (UniqueName: \"kubernetes.io/projected/f76d71a8-ef71-49d2-815f-c043aa32ab16-kube-api-access-zh5pt\") pod \"auto-csr-approver-29555482-sjpgl\" (UID: \"f76d71a8-ef71-49d2-815f-c043aa32ab16\") " pod="openshift-infra/auto-csr-approver-29555482-sjpgl" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.337676 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5pt\" (UniqueName: \"kubernetes.io/projected/f76d71a8-ef71-49d2-815f-c043aa32ab16-kube-api-access-zh5pt\") pod \"auto-csr-approver-29555482-sjpgl\" (UID: \"f76d71a8-ef71-49d2-815f-c043aa32ab16\") " 
pod="openshift-infra/auto-csr-approver-29555482-sjpgl" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.466718 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-sjpgl" Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.914247 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-sjpgl"] Mar 12 15:22:00 crc kubenswrapper[4869]: W0312 15:22:00.919465 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf76d71a8_ef71_49d2_815f_c043aa32ab16.slice/crio-af945622ccd03f7dd24be51ef32a7f3cef1072f18b797724ad0930f9f3acdc01 WatchSource:0}: Error finding container af945622ccd03f7dd24be51ef32a7f3cef1072f18b797724ad0930f9f3acdc01: Status 404 returned error can't find the container with id af945622ccd03f7dd24be51ef32a7f3cef1072f18b797724ad0930f9f3acdc01 Mar 12 15:22:00 crc kubenswrapper[4869]: I0312 15:22:00.921814 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:22:01 crc kubenswrapper[4869]: I0312 15:22:01.905734 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555482-sjpgl" event={"ID":"f76d71a8-ef71-49d2-815f-c043aa32ab16","Type":"ContainerStarted","Data":"af945622ccd03f7dd24be51ef32a7f3cef1072f18b797724ad0930f9f3acdc01"} Mar 12 15:22:02 crc kubenswrapper[4869]: I0312 15:22:02.916325 4869 generic.go:334] "Generic (PLEG): container finished" podID="f76d71a8-ef71-49d2-815f-c043aa32ab16" containerID="68df47a6888c65bfd924d124542625f146030dc647a468ae45a4614a4bdda223" exitCode=0 Mar 12 15:22:02 crc kubenswrapper[4869]: I0312 15:22:02.916404 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555482-sjpgl" 
event={"ID":"f76d71a8-ef71-49d2-815f-c043aa32ab16","Type":"ContainerDied","Data":"68df47a6888c65bfd924d124542625f146030dc647a468ae45a4614a4bdda223"} Mar 12 15:22:04 crc kubenswrapper[4869]: I0312 15:22:04.257287 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-sjpgl" Mar 12 15:22:04 crc kubenswrapper[4869]: I0312 15:22:04.299695 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh5pt\" (UniqueName: \"kubernetes.io/projected/f76d71a8-ef71-49d2-815f-c043aa32ab16-kube-api-access-zh5pt\") pod \"f76d71a8-ef71-49d2-815f-c043aa32ab16\" (UID: \"f76d71a8-ef71-49d2-815f-c043aa32ab16\") " Mar 12 15:22:04 crc kubenswrapper[4869]: I0312 15:22:04.306893 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76d71a8-ef71-49d2-815f-c043aa32ab16-kube-api-access-zh5pt" (OuterVolumeSpecName: "kube-api-access-zh5pt") pod "f76d71a8-ef71-49d2-815f-c043aa32ab16" (UID: "f76d71a8-ef71-49d2-815f-c043aa32ab16"). InnerVolumeSpecName "kube-api-access-zh5pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:22:04 crc kubenswrapper[4869]: I0312 15:22:04.402793 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh5pt\" (UniqueName: \"kubernetes.io/projected/f76d71a8-ef71-49d2-815f-c043aa32ab16-kube-api-access-zh5pt\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:04 crc kubenswrapper[4869]: I0312 15:22:04.939164 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555482-sjpgl" event={"ID":"f76d71a8-ef71-49d2-815f-c043aa32ab16","Type":"ContainerDied","Data":"af945622ccd03f7dd24be51ef32a7f3cef1072f18b797724ad0930f9f3acdc01"} Mar 12 15:22:04 crc kubenswrapper[4869]: I0312 15:22:04.939223 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-sjpgl" Mar 12 15:22:04 crc kubenswrapper[4869]: I0312 15:22:04.939233 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af945622ccd03f7dd24be51ef32a7f3cef1072f18b797724ad0930f9f3acdc01" Mar 12 15:22:05 crc kubenswrapper[4869]: I0312 15:22:05.330654 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-rt59v"] Mar 12 15:22:05 crc kubenswrapper[4869]: I0312 15:22:05.339473 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-rt59v"] Mar 12 15:22:06 crc kubenswrapper[4869]: I0312 15:22:06.352428 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e96d5205-aee2-4126-8c64-9de951feffc7" path="/var/lib/kubelet/pods/e96d5205-aee2-4126-8c64-9de951feffc7/volumes" Mar 12 15:22:34 crc kubenswrapper[4869]: I0312 15:22:34.182049 4869 scope.go:117] "RemoveContainer" containerID="d6c951f1a88c0beec45caad509423a3b6e84f11c515c38560b98ea0c2a140ae6" Mar 12 15:22:35 crc kubenswrapper[4869]: I0312 15:22:35.330745 4869 generic.go:334] "Generic (PLEG): container finished" podID="42c001af-bc5e-4906-b65f-0ec328893bce" containerID="1c9fe49d0b81729ef634d9ef1bc998c671af2a36edafe7040c7f1cea1bd87bea" exitCode=0 Mar 12 15:22:35 crc kubenswrapper[4869]: I0312 15:22:35.330790 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" event={"ID":"42c001af-bc5e-4906-b65f-0ec328893bce","Type":"ContainerDied","Data":"1c9fe49d0b81729ef634d9ef1bc998c671af2a36edafe7040c7f1cea1bd87bea"} Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.764120 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.909510 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"42c001af-bc5e-4906-b65f-0ec328893bce\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.909649 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-nova-metadata-neutron-config-0\") pod \"42c001af-bc5e-4906-b65f-0ec328893bce\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.909680 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwd5b\" (UniqueName: \"kubernetes.io/projected/42c001af-bc5e-4906-b65f-0ec328893bce-kube-api-access-bwd5b\") pod \"42c001af-bc5e-4906-b65f-0ec328893bce\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.909737 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-metadata-combined-ca-bundle\") pod \"42c001af-bc5e-4906-b65f-0ec328893bce\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.909861 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-ssh-key-openstack-edpm-ipam\") pod \"42c001af-bc5e-4906-b65f-0ec328893bce\" (UID: 
\"42c001af-bc5e-4906-b65f-0ec328893bce\") " Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.909903 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-inventory\") pod \"42c001af-bc5e-4906-b65f-0ec328893bce\" (UID: \"42c001af-bc5e-4906-b65f-0ec328893bce\") " Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.916775 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "42c001af-bc5e-4906-b65f-0ec328893bce" (UID: "42c001af-bc5e-4906-b65f-0ec328893bce"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.917345 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c001af-bc5e-4906-b65f-0ec328893bce-kube-api-access-bwd5b" (OuterVolumeSpecName: "kube-api-access-bwd5b") pod "42c001af-bc5e-4906-b65f-0ec328893bce" (UID: "42c001af-bc5e-4906-b65f-0ec328893bce"). InnerVolumeSpecName "kube-api-access-bwd5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.945514 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "42c001af-bc5e-4906-b65f-0ec328893bce" (UID: "42c001af-bc5e-4906-b65f-0ec328893bce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.949734 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-inventory" (OuterVolumeSpecName: "inventory") pod "42c001af-bc5e-4906-b65f-0ec328893bce" (UID: "42c001af-bc5e-4906-b65f-0ec328893bce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.949792 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "42c001af-bc5e-4906-b65f-0ec328893bce" (UID: "42c001af-bc5e-4906-b65f-0ec328893bce"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:22:36 crc kubenswrapper[4869]: I0312 15:22:36.955149 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "42c001af-bc5e-4906-b65f-0ec328893bce" (UID: "42c001af-bc5e-4906-b65f-0ec328893bce"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.012278 4869 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.012590 4869 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.012602 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwd5b\" (UniqueName: \"kubernetes.io/projected/42c001af-bc5e-4906-b65f-0ec328893bce-kube-api-access-bwd5b\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.012613 4869 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.012624 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.012634 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42c001af-bc5e-4906-b65f-0ec328893bce-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.356620 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" event={"ID":"42c001af-bc5e-4906-b65f-0ec328893bce","Type":"ContainerDied","Data":"89c85f427a0abcbaa0ca8e99d54f2af7d3c55a7c314607f6f598006be0a13aae"} Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.356665 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89c85f427a0abcbaa0ca8e99d54f2af7d3c55a7c314607f6f598006be0a13aae" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.356760 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.532564 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw"] Mar 12 15:22:37 crc kubenswrapper[4869]: E0312 15:22:37.532969 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c001af-bc5e-4906-b65f-0ec328893bce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.532993 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c001af-bc5e-4906-b65f-0ec328893bce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 12 15:22:37 crc kubenswrapper[4869]: E0312 15:22:37.533041 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76d71a8-ef71-49d2-815f-c043aa32ab16" containerName="oc" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.533049 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76d71a8-ef71-49d2-815f-c043aa32ab16" containerName="oc" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.533270 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76d71a8-ef71-49d2-815f-c043aa32ab16" containerName="oc" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.533297 4869 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="42c001af-bc5e-4906-b65f-0ec328893bce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.534071 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.538909 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.539009 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.539652 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.540102 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.540630 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.570268 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw"] Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.630936 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.630985 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.631315 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzg9k\" (UniqueName: \"kubernetes.io/projected/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-kube-api-access-wzg9k\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.631486 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.631580 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.732919 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzg9k\" (UniqueName: \"kubernetes.io/projected/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-kube-api-access-wzg9k\") 
pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.733003 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.733032 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.733069 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.733092 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.738242 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.738786 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.739085 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.742381 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.753180 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzg9k\" (UniqueName: \"kubernetes.io/projected/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-kube-api-access-wzg9k\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw\" (UID: 
\"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:37 crc kubenswrapper[4869]: I0312 15:22:37.874749 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:22:38 crc kubenswrapper[4869]: I0312 15:22:38.402142 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw"] Mar 12 15:22:39 crc kubenswrapper[4869]: I0312 15:22:39.388269 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" event={"ID":"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a","Type":"ContainerStarted","Data":"1d5c7c2b0a4a39b4035e52b722c55b6e1e8602f9b778240d575c1b2fe08d1264"} Mar 12 15:22:39 crc kubenswrapper[4869]: I0312 15:22:39.388594 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" event={"ID":"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a","Type":"ContainerStarted","Data":"dc8d223040de31087eb3187079ff20473cca8175d818ac28c9567e33d048fc78"} Mar 12 15:22:39 crc kubenswrapper[4869]: I0312 15:22:39.405950 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" podStartSLOduration=2.021301527 podStartE2EDuration="2.405926696s" podCreationTimestamp="2026-03-12 15:22:37 +0000 UTC" firstStartedPulling="2026-03-12 15:22:38.410604483 +0000 UTC m=+2110.695829761" lastFinishedPulling="2026-03-12 15:22:38.795229652 +0000 UTC m=+2111.080454930" observedRunningTime="2026-03-12 15:22:39.404871189 +0000 UTC m=+2111.690096487" watchObservedRunningTime="2026-03-12 15:22:39.405926696 +0000 UTC m=+2111.691151974" Mar 12 15:23:19 crc kubenswrapper[4869]: I0312 15:23:19.684234 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:23:19 crc kubenswrapper[4869]: I0312 15:23:19.686001 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:23:49 crc kubenswrapper[4869]: I0312 15:23:49.684039 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:23:49 crc kubenswrapper[4869]: I0312 15:23:49.684515 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.175490 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555484-zk5zb"] Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.178211 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-zk5zb" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.180238 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.180384 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.180632 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.186560 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-zk5zb"] Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.245345 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99tmr\" (UniqueName: \"kubernetes.io/projected/4c5371f8-4455-4af9-a234-c4c9c8de6ec9-kube-api-access-99tmr\") pod \"auto-csr-approver-29555484-zk5zb\" (UID: \"4c5371f8-4455-4af9-a234-c4c9c8de6ec9\") " pod="openshift-infra/auto-csr-approver-29555484-zk5zb" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.347170 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99tmr\" (UniqueName: \"kubernetes.io/projected/4c5371f8-4455-4af9-a234-c4c9c8de6ec9-kube-api-access-99tmr\") pod \"auto-csr-approver-29555484-zk5zb\" (UID: \"4c5371f8-4455-4af9-a234-c4c9c8de6ec9\") " pod="openshift-infra/auto-csr-approver-29555484-zk5zb" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.374282 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99tmr\" (UniqueName: \"kubernetes.io/projected/4c5371f8-4455-4af9-a234-c4c9c8de6ec9-kube-api-access-99tmr\") pod \"auto-csr-approver-29555484-zk5zb\" (UID: \"4c5371f8-4455-4af9-a234-c4c9c8de6ec9\") " 
pod="openshift-infra/auto-csr-approver-29555484-zk5zb" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.498002 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-zk5zb" Mar 12 15:24:00 crc kubenswrapper[4869]: I0312 15:24:00.921412 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-zk5zb"] Mar 12 15:24:00 crc kubenswrapper[4869]: W0312 15:24:00.929220 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5371f8_4455_4af9_a234_c4c9c8de6ec9.slice/crio-cac55c5374b0f2ad71855f630de1bc98ccb127bb45c81fd7bc1fd36b057f3b23 WatchSource:0}: Error finding container cac55c5374b0f2ad71855f630de1bc98ccb127bb45c81fd7bc1fd36b057f3b23: Status 404 returned error can't find the container with id cac55c5374b0f2ad71855f630de1bc98ccb127bb45c81fd7bc1fd36b057f3b23 Mar 12 15:24:01 crc kubenswrapper[4869]: I0312 15:24:01.066661 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555484-zk5zb" event={"ID":"4c5371f8-4455-4af9-a234-c4c9c8de6ec9","Type":"ContainerStarted","Data":"cac55c5374b0f2ad71855f630de1bc98ccb127bb45c81fd7bc1fd36b057f3b23"} Mar 12 15:24:03 crc kubenswrapper[4869]: I0312 15:24:03.088650 4869 generic.go:334] "Generic (PLEG): container finished" podID="4c5371f8-4455-4af9-a234-c4c9c8de6ec9" containerID="ccb610c4a3ac79138c5a060ba9fbe09037b3fd9fc8acebcbcdc8da98d09108b8" exitCode=0 Mar 12 15:24:03 crc kubenswrapper[4869]: I0312 15:24:03.088733 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555484-zk5zb" event={"ID":"4c5371f8-4455-4af9-a234-c4c9c8de6ec9","Type":"ContainerDied","Data":"ccb610c4a3ac79138c5a060ba9fbe09037b3fd9fc8acebcbcdc8da98d09108b8"} Mar 12 15:24:04 crc kubenswrapper[4869]: I0312 15:24:04.422946 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-zk5zb" Mar 12 15:24:04 crc kubenswrapper[4869]: I0312 15:24:04.548580 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99tmr\" (UniqueName: \"kubernetes.io/projected/4c5371f8-4455-4af9-a234-c4c9c8de6ec9-kube-api-access-99tmr\") pod \"4c5371f8-4455-4af9-a234-c4c9c8de6ec9\" (UID: \"4c5371f8-4455-4af9-a234-c4c9c8de6ec9\") " Mar 12 15:24:04 crc kubenswrapper[4869]: I0312 15:24:04.555240 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5371f8-4455-4af9-a234-c4c9c8de6ec9-kube-api-access-99tmr" (OuterVolumeSpecName: "kube-api-access-99tmr") pod "4c5371f8-4455-4af9-a234-c4c9c8de6ec9" (UID: "4c5371f8-4455-4af9-a234-c4c9c8de6ec9"). InnerVolumeSpecName "kube-api-access-99tmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:24:04 crc kubenswrapper[4869]: I0312 15:24:04.651463 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99tmr\" (UniqueName: \"kubernetes.io/projected/4c5371f8-4455-4af9-a234-c4c9c8de6ec9-kube-api-access-99tmr\") on node \"crc\" DevicePath \"\"" Mar 12 15:24:05 crc kubenswrapper[4869]: I0312 15:24:05.106796 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555484-zk5zb" event={"ID":"4c5371f8-4455-4af9-a234-c4c9c8de6ec9","Type":"ContainerDied","Data":"cac55c5374b0f2ad71855f630de1bc98ccb127bb45c81fd7bc1fd36b057f3b23"} Mar 12 15:24:05 crc kubenswrapper[4869]: I0312 15:24:05.106872 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-zk5zb" Mar 12 15:24:05 crc kubenswrapper[4869]: I0312 15:24:05.106887 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cac55c5374b0f2ad71855f630de1bc98ccb127bb45c81fd7bc1fd36b057f3b23" Mar 12 15:24:05 crc kubenswrapper[4869]: I0312 15:24:05.497941 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-5smpd"] Mar 12 15:24:05 crc kubenswrapper[4869]: I0312 15:24:05.507013 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-5smpd"] Mar 12 15:24:06 crc kubenswrapper[4869]: I0312 15:24:06.347441 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11410e3-6d91-4cfe-b61d-fa28af63504a" path="/var/lib/kubelet/pods/a11410e3-6d91-4cfe-b61d-fa28af63504a/volumes" Mar 12 15:24:19 crc kubenswrapper[4869]: I0312 15:24:19.684472 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:24:19 crc kubenswrapper[4869]: I0312 15:24:19.684987 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:24:19 crc kubenswrapper[4869]: I0312 15:24:19.685022 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:24:19 crc kubenswrapper[4869]: I0312 15:24:19.685765 4869 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d2f6b1edc93a1efc0577095741ab3b3ad70babba8d5975e923b0ee75f9c99a6"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:24:19 crc kubenswrapper[4869]: I0312 15:24:19.685837 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://6d2f6b1edc93a1efc0577095741ab3b3ad70babba8d5975e923b0ee75f9c99a6" gracePeriod=600 Mar 12 15:24:20 crc kubenswrapper[4869]: I0312 15:24:20.250422 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="6d2f6b1edc93a1efc0577095741ab3b3ad70babba8d5975e923b0ee75f9c99a6" exitCode=0 Mar 12 15:24:20 crc kubenswrapper[4869]: I0312 15:24:20.250470 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"6d2f6b1edc93a1efc0577095741ab3b3ad70babba8d5975e923b0ee75f9c99a6"} Mar 12 15:24:20 crc kubenswrapper[4869]: I0312 15:24:20.250865 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0"} Mar 12 15:24:20 crc kubenswrapper[4869]: I0312 15:24:20.250891 4869 scope.go:117] "RemoveContainer" containerID="bc737999b30693337d1e520b47de07490eab1ec628aefabcaeb8e63a17e676eb" Mar 12 15:24:34 crc kubenswrapper[4869]: I0312 15:24:34.312963 4869 scope.go:117] "RemoveContainer" containerID="1d076ff963f5c0bce1825f7bbe8cd5b4193df8f5e7c7af916609a77246f4cd77" Mar 12 
15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.226861 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-994qp"] Mar 12 15:25:30 crc kubenswrapper[4869]: E0312 15:25:30.228669 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5371f8-4455-4af9-a234-c4c9c8de6ec9" containerName="oc" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.228683 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5371f8-4455-4af9-a234-c4c9c8de6ec9" containerName="oc" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.229087 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5371f8-4455-4af9-a234-c4c9c8de6ec9" containerName="oc" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.231503 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.255486 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-994qp"] Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.304151 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq9wj\" (UniqueName: \"kubernetes.io/projected/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-kube-api-access-sq9wj\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.304205 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-utilities\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.304237 4869 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-catalog-content\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.406196 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq9wj\" (UniqueName: \"kubernetes.io/projected/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-kube-api-access-sq9wj\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.406251 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-utilities\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.406306 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-catalog-content\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.406872 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-utilities\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.407031 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-catalog-content\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.433447 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq9wj\" (UniqueName: \"kubernetes.io/projected/6366ae71-5f0e-40d7-b27a-d94649d6ca4f-kube-api-access-sq9wj\") pod \"community-operators-994qp\" (UID: \"6366ae71-5f0e-40d7-b27a-d94649d6ca4f\") " pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:30 crc kubenswrapper[4869]: I0312 15:25:30.563868 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:31 crc kubenswrapper[4869]: I0312 15:25:31.120555 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-994qp"] Mar 12 15:25:31 crc kubenswrapper[4869]: I0312 15:25:31.864591 4869 generic.go:334] "Generic (PLEG): container finished" podID="6366ae71-5f0e-40d7-b27a-d94649d6ca4f" containerID="a575420f9c6f9c32b0da500158c103474bcd5a736bb704e163ce15991e582ffb" exitCode=0 Mar 12 15:25:31 crc kubenswrapper[4869]: I0312 15:25:31.864685 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-994qp" event={"ID":"6366ae71-5f0e-40d7-b27a-d94649d6ca4f","Type":"ContainerDied","Data":"a575420f9c6f9c32b0da500158c103474bcd5a736bb704e163ce15991e582ffb"} Mar 12 15:25:31 crc kubenswrapper[4869]: I0312 15:25:31.865009 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-994qp" event={"ID":"6366ae71-5f0e-40d7-b27a-d94649d6ca4f","Type":"ContainerStarted","Data":"03c478b2412400a5cb990ce2398e3d020c3aa925b0ed7f4f1f1663671cb1de3c"} Mar 12 
15:25:35 crc kubenswrapper[4869]: I0312 15:25:35.899736 4869 generic.go:334] "Generic (PLEG): container finished" podID="6366ae71-5f0e-40d7-b27a-d94649d6ca4f" containerID="1f8993c73c1109041e7b7e96c1cad0c0d9d66a3c9c031256e97071c6145b16e5" exitCode=0 Mar 12 15:25:35 crc kubenswrapper[4869]: I0312 15:25:35.899808 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-994qp" event={"ID":"6366ae71-5f0e-40d7-b27a-d94649d6ca4f","Type":"ContainerDied","Data":"1f8993c73c1109041e7b7e96c1cad0c0d9d66a3c9c031256e97071c6145b16e5"} Mar 12 15:25:36 crc kubenswrapper[4869]: I0312 15:25:36.913886 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-994qp" event={"ID":"6366ae71-5f0e-40d7-b27a-d94649d6ca4f","Type":"ContainerStarted","Data":"527156c3a95c50d993baf52ddf83bebadc03bd9dd1d5bf681298d7db54acb050"} Mar 12 15:25:36 crc kubenswrapper[4869]: I0312 15:25:36.944470 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-994qp" podStartSLOduration=2.455462395 podStartE2EDuration="6.944449616s" podCreationTimestamp="2026-03-12 15:25:30 +0000 UTC" firstStartedPulling="2026-03-12 15:25:31.867453185 +0000 UTC m=+2284.152678473" lastFinishedPulling="2026-03-12 15:25:36.356440416 +0000 UTC m=+2288.641665694" observedRunningTime="2026-03-12 15:25:36.93457416 +0000 UTC m=+2289.219799438" watchObservedRunningTime="2026-03-12 15:25:36.944449616 +0000 UTC m=+2289.229674914" Mar 12 15:25:40 crc kubenswrapper[4869]: I0312 15:25:40.564824 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:40 crc kubenswrapper[4869]: I0312 15:25:40.566509 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:40 crc kubenswrapper[4869]: I0312 15:25:40.626364 4869 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:42 crc kubenswrapper[4869]: I0312 15:25:42.007639 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-994qp" Mar 12 15:25:42 crc kubenswrapper[4869]: I0312 15:25:42.072984 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-994qp"] Mar 12 15:25:42 crc kubenswrapper[4869]: I0312 15:25:42.117614 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlrr7"] Mar 12 15:25:42 crc kubenswrapper[4869]: I0312 15:25:42.117903 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hlrr7" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="registry-server" containerID="cri-o://1f3d97fb96f3c37d2d5b5fea4eeb7efd8571ecf8b66f6fc30ed13ee954df3d7b" gracePeriod=2 Mar 12 15:25:42 crc kubenswrapper[4869]: E0312 15:25:42.417463 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870b2ca4_83f3_4a2b_9bc4_3dea9c1773f6.slice/crio-1f3d97fb96f3c37d2d5b5fea4eeb7efd8571ecf8b66f6fc30ed13ee954df3d7b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870b2ca4_83f3_4a2b_9bc4_3dea9c1773f6.slice/crio-conmon-1f3d97fb96f3c37d2d5b5fea4eeb7efd8571ecf8b66f6fc30ed13ee954df3d7b.scope\": RecentStats: unable to find data in memory cache]" Mar 12 15:25:42 crc kubenswrapper[4869]: I0312 15:25:42.965724 4869 generic.go:334] "Generic (PLEG): container finished" podID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerID="1f3d97fb96f3c37d2d5b5fea4eeb7efd8571ecf8b66f6fc30ed13ee954df3d7b" exitCode=0 Mar 12 15:25:42 crc kubenswrapper[4869]: I0312 15:25:42.965819 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlrr7" event={"ID":"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6","Type":"ContainerDied","Data":"1f3d97fb96f3c37d2d5b5fea4eeb7efd8571ecf8b66f6fc30ed13ee954df3d7b"} Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.675239 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlrr7" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.796443 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-catalog-content\") pod \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.796724 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xwgf\" (UniqueName: \"kubernetes.io/projected/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-kube-api-access-9xwgf\") pod \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.796750 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-utilities\") pod \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\" (UID: \"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6\") " Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.798699 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-utilities" (OuterVolumeSpecName: "utilities") pod "870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" (UID: "870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.805725 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-kube-api-access-9xwgf" (OuterVolumeSpecName: "kube-api-access-9xwgf") pod "870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" (UID: "870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6"). InnerVolumeSpecName "kube-api-access-9xwgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.888197 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" (UID: "870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.898646 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xwgf\" (UniqueName: \"kubernetes.io/projected/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-kube-api-access-9xwgf\") on node \"crc\" DevicePath \"\"" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.898668 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.898676 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.976574 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlrr7" 
event={"ID":"870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6","Type":"ContainerDied","Data":"780c45694cf780d1739d5a67d67033d17d915e53ea0fc5b26863aac3c87476a3"} Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.976629 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlrr7" Mar 12 15:25:43 crc kubenswrapper[4869]: I0312 15:25:43.976647 4869 scope.go:117] "RemoveContainer" containerID="1f3d97fb96f3c37d2d5b5fea4eeb7efd8571ecf8b66f6fc30ed13ee954df3d7b" Mar 12 15:25:44 crc kubenswrapper[4869]: I0312 15:25:44.002199 4869 scope.go:117] "RemoveContainer" containerID="7a51ccbaafecd816b0bc847e17ccfee444a82598853130c85a3b74cdef0ac9d3" Mar 12 15:25:44 crc kubenswrapper[4869]: I0312 15:25:44.022585 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlrr7"] Mar 12 15:25:44 crc kubenswrapper[4869]: I0312 15:25:44.039173 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hlrr7"] Mar 12 15:25:44 crc kubenswrapper[4869]: I0312 15:25:44.043763 4869 scope.go:117] "RemoveContainer" containerID="fd515955af0aa011af81ccef68cc4d1e4cb8e40f7636f79f1d0a356ccfbaec0c" Mar 12 15:25:44 crc kubenswrapper[4869]: I0312 15:25:44.347636 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" path="/var/lib/kubelet/pods/870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6/volumes" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.137106 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555486-27ldq"] Mar 12 15:26:00 crc kubenswrapper[4869]: E0312 15:26:00.138193 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="extract-content" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.138240 4869 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="extract-content" Mar 12 15:26:00 crc kubenswrapper[4869]: E0312 15:26:00.138273 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="registry-server" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.138280 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="registry-server" Mar 12 15:26:00 crc kubenswrapper[4869]: E0312 15:26:00.138295 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="extract-utilities" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.138302 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="extract-utilities" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.138555 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="870b2ca4-83f3-4a2b-9bc4-3dea9c1773f6" containerName="registry-server" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.139357 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-27ldq" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.141916 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.141917 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.142995 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.147600 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-27ldq"] Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.331221 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj4j5\" (UniqueName: \"kubernetes.io/projected/d393e968-8951-49cf-8ff2-5f14d4230b14-kube-api-access-vj4j5\") pod \"auto-csr-approver-29555486-27ldq\" (UID: \"d393e968-8951-49cf-8ff2-5f14d4230b14\") " pod="openshift-infra/auto-csr-approver-29555486-27ldq" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.433281 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj4j5\" (UniqueName: \"kubernetes.io/projected/d393e968-8951-49cf-8ff2-5f14d4230b14-kube-api-access-vj4j5\") pod \"auto-csr-approver-29555486-27ldq\" (UID: \"d393e968-8951-49cf-8ff2-5f14d4230b14\") " pod="openshift-infra/auto-csr-approver-29555486-27ldq" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.463108 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj4j5\" (UniqueName: \"kubernetes.io/projected/d393e968-8951-49cf-8ff2-5f14d4230b14-kube-api-access-vj4j5\") pod \"auto-csr-approver-29555486-27ldq\" (UID: \"d393e968-8951-49cf-8ff2-5f14d4230b14\") " 
pod="openshift-infra/auto-csr-approver-29555486-27ldq" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.464108 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-27ldq" Mar 12 15:26:00 crc kubenswrapper[4869]: I0312 15:26:00.940452 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-27ldq"] Mar 12 15:26:01 crc kubenswrapper[4869]: I0312 15:26:01.165568 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555486-27ldq" event={"ID":"d393e968-8951-49cf-8ff2-5f14d4230b14","Type":"ContainerStarted","Data":"072dafe84bbb0526d806ced327e599984c074fa8f811bb043109478e55854f58"} Mar 12 15:26:02 crc kubenswrapper[4869]: I0312 15:26:02.174160 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555486-27ldq" event={"ID":"d393e968-8951-49cf-8ff2-5f14d4230b14","Type":"ContainerStarted","Data":"9b91e001d0acac955428a1550e634a13efe70169b97ebf32d51852f46c0067a6"} Mar 12 15:26:02 crc kubenswrapper[4869]: I0312 15:26:02.198455 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555486-27ldq" podStartSLOduration=1.328655409 podStartE2EDuration="2.198430813s" podCreationTimestamp="2026-03-12 15:26:00 +0000 UTC" firstStartedPulling="2026-03-12 15:26:00.953829122 +0000 UTC m=+2313.239054400" lastFinishedPulling="2026-03-12 15:26:01.823604526 +0000 UTC m=+2314.108829804" observedRunningTime="2026-03-12 15:26:02.188046294 +0000 UTC m=+2314.473271592" watchObservedRunningTime="2026-03-12 15:26:02.198430813 +0000 UTC m=+2314.483656101" Mar 12 15:26:03 crc kubenswrapper[4869]: I0312 15:26:03.191215 4869 generic.go:334] "Generic (PLEG): container finished" podID="d393e968-8951-49cf-8ff2-5f14d4230b14" containerID="9b91e001d0acac955428a1550e634a13efe70169b97ebf32d51852f46c0067a6" exitCode=0 Mar 12 15:26:03 crc 
kubenswrapper[4869]: I0312 15:26:03.191276 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555486-27ldq" event={"ID":"d393e968-8951-49cf-8ff2-5f14d4230b14","Type":"ContainerDied","Data":"9b91e001d0acac955428a1550e634a13efe70169b97ebf32d51852f46c0067a6"} Mar 12 15:26:04 crc kubenswrapper[4869]: I0312 15:26:04.595815 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-27ldq" Mar 12 15:26:04 crc kubenswrapper[4869]: I0312 15:26:04.655386 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj4j5\" (UniqueName: \"kubernetes.io/projected/d393e968-8951-49cf-8ff2-5f14d4230b14-kube-api-access-vj4j5\") pod \"d393e968-8951-49cf-8ff2-5f14d4230b14\" (UID: \"d393e968-8951-49cf-8ff2-5f14d4230b14\") " Mar 12 15:26:04 crc kubenswrapper[4869]: I0312 15:26:04.662038 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d393e968-8951-49cf-8ff2-5f14d4230b14-kube-api-access-vj4j5" (OuterVolumeSpecName: "kube-api-access-vj4j5") pod "d393e968-8951-49cf-8ff2-5f14d4230b14" (UID: "d393e968-8951-49cf-8ff2-5f14d4230b14"). InnerVolumeSpecName "kube-api-access-vj4j5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:26:04 crc kubenswrapper[4869]: I0312 15:26:04.757431 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj4j5\" (UniqueName: \"kubernetes.io/projected/d393e968-8951-49cf-8ff2-5f14d4230b14-kube-api-access-vj4j5\") on node \"crc\" DevicePath \"\"" Mar 12 15:26:05 crc kubenswrapper[4869]: I0312 15:26:05.208899 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555486-27ldq" event={"ID":"d393e968-8951-49cf-8ff2-5f14d4230b14","Type":"ContainerDied","Data":"072dafe84bbb0526d806ced327e599984c074fa8f811bb043109478e55854f58"} Mar 12 15:26:05 crc kubenswrapper[4869]: I0312 15:26:05.209304 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="072dafe84bbb0526d806ced327e599984c074fa8f811bb043109478e55854f58" Mar 12 15:26:05 crc kubenswrapper[4869]: I0312 15:26:05.208934 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-27ldq" Mar 12 15:26:05 crc kubenswrapper[4869]: I0312 15:26:05.263509 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-bplrp"] Mar 12 15:26:05 crc kubenswrapper[4869]: I0312 15:26:05.271042 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-bplrp"] Mar 12 15:26:06 crc kubenswrapper[4869]: I0312 15:26:06.347227 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21e3145-7118-4360-90d6-9f32c92208b2" path="/var/lib/kubelet/pods/a21e3145-7118-4360-90d6-9f32c92208b2/volumes" Mar 12 15:26:29 crc kubenswrapper[4869]: I0312 15:26:29.418331 4869 generic.go:334] "Generic (PLEG): container finished" podID="2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" containerID="1d5c7c2b0a4a39b4035e52b722c55b6e1e8602f9b778240d575c1b2fe08d1264" exitCode=0 Mar 12 15:26:29 crc kubenswrapper[4869]: I0312 15:26:29.418404 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" event={"ID":"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a","Type":"ContainerDied","Data":"1d5c7c2b0a4a39b4035e52b722c55b6e1e8602f9b778240d575c1b2fe08d1264"} Mar 12 15:26:30 crc kubenswrapper[4869]: I0312 15:26:30.857921 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.015086 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-secret-0\") pod \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.015216 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzg9k\" (UniqueName: \"kubernetes.io/projected/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-kube-api-access-wzg9k\") pod \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.015341 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-combined-ca-bundle\") pod \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.015399 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-ssh-key-openstack-edpm-ipam\") pod \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 
15:26:31.015476 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-inventory\") pod \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\" (UID: \"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a\") " Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.021261 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" (UID: "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.023763 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-kube-api-access-wzg9k" (OuterVolumeSpecName: "kube-api-access-wzg9k") pod "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" (UID: "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a"). InnerVolumeSpecName "kube-api-access-wzg9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.068528 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" (UID: "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.069407 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-inventory" (OuterVolumeSpecName: "inventory") pod "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" (UID: "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.074713 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" (UID: "2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.118585 4869 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.118618 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzg9k\" (UniqueName: \"kubernetes.io/projected/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-kube-api-access-wzg9k\") on node \"crc\" DevicePath \"\"" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.118630 4869 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.118641 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.118651 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.436331 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" event={"ID":"2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a","Type":"ContainerDied","Data":"dc8d223040de31087eb3187079ff20473cca8175d818ac28c9567e33d048fc78"} Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.436677 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc8d223040de31087eb3187079ff20473cca8175d818ac28c9567e33d048fc78" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.436755 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.536889 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd"] Mar 12 15:26:31 crc kubenswrapper[4869]: E0312 15:26:31.537358 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d393e968-8951-49cf-8ff2-5f14d4230b14" containerName="oc" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.537379 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d393e968-8951-49cf-8ff2-5f14d4230b14" containerName="oc" Mar 12 15:26:31 crc kubenswrapper[4869]: E0312 15:26:31.537416 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.537425 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.537632 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d393e968-8951-49cf-8ff2-5f14d4230b14" containerName="oc" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.537649 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.538365 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.542876 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.542980 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.543004 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.543177 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.545392 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.546519 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.546897 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.555307 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd"] Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.630576 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: 
I0312 15:26:31.630634 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.630992 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631092 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mv8k\" (UniqueName: \"kubernetes.io/projected/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-kube-api-access-2mv8k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631284 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631340 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631394 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631457 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631627 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631681 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: 
\"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.631731 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733427 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733496 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733565 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733600 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733626 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733664 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733688 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733788 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733818 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mv8k\" (UniqueName: \"kubernetes.io/projected/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-kube-api-access-2mv8k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733883 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.733910 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.735218 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.738650 4869 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.739427 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.739462 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.740137 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.742161 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: 
\"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.742265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.742628 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.743226 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.744046 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.752307 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2mv8k\" (UniqueName: \"kubernetes.io/projected/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-kube-api-access-2mv8k\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zssnd\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:31 crc kubenswrapper[4869]: I0312 15:26:31.857951 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:26:32 crc kubenswrapper[4869]: I0312 15:26:32.397628 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd"] Mar 12 15:26:32 crc kubenswrapper[4869]: I0312 15:26:32.445424 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" event={"ID":"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4","Type":"ContainerStarted","Data":"c78a14f6c1a1353868986c2fc4d23e18ab69ce9cdbb117c9d9df7fe8c37241d3"} Mar 12 15:26:33 crc kubenswrapper[4869]: I0312 15:26:33.458985 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" event={"ID":"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4","Type":"ContainerStarted","Data":"dbddde6d585d7a3334a39ac7dd1502401e8be57aae435bc7442cf3dd8330700a"} Mar 12 15:26:33 crc kubenswrapper[4869]: I0312 15:26:33.484937 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" podStartSLOduration=1.988757133 podStartE2EDuration="2.484917643s" podCreationTimestamp="2026-03-12 15:26:31 +0000 UTC" firstStartedPulling="2026-03-12 15:26:32.408078278 +0000 UTC m=+2344.693303556" lastFinishedPulling="2026-03-12 15:26:32.904238788 +0000 UTC m=+2345.189464066" observedRunningTime="2026-03-12 15:26:33.481403325 +0000 UTC m=+2345.766628623" watchObservedRunningTime="2026-03-12 15:26:33.484917643 +0000 UTC m=+2345.770142921" Mar 12 
15:26:34 crc kubenswrapper[4869]: I0312 15:26:34.398390 4869 scope.go:117] "RemoveContainer" containerID="5c4d37df4b83531931f84bb5a5adfe55bc370852940e13d94da0157dac4bb7b4" Mar 12 15:26:49 crc kubenswrapper[4869]: I0312 15:26:49.685002 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:26:49 crc kubenswrapper[4869]: I0312 15:26:49.685508 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:27:19 crc kubenswrapper[4869]: I0312 15:27:19.684615 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:27:19 crc kubenswrapper[4869]: I0312 15:27:19.685156 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:27:49 crc kubenswrapper[4869]: I0312 15:27:49.683734 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 12 15:27:49 crc kubenswrapper[4869]: I0312 15:27:49.684115 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:27:49 crc kubenswrapper[4869]: I0312 15:27:49.684155 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:27:49 crc kubenswrapper[4869]: I0312 15:27:49.685029 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:27:49 crc kubenswrapper[4869]: I0312 15:27:49.685074 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" gracePeriod=600 Mar 12 15:27:49 crc kubenswrapper[4869]: E0312 15:27:49.824208 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:27:50 crc kubenswrapper[4869]: I0312 
15:27:50.110146 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" exitCode=0 Mar 12 15:27:50 crc kubenswrapper[4869]: I0312 15:27:50.110189 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0"} Mar 12 15:27:50 crc kubenswrapper[4869]: I0312 15:27:50.110225 4869 scope.go:117] "RemoveContainer" containerID="6d2f6b1edc93a1efc0577095741ab3b3ad70babba8d5975e923b0ee75f9c99a6" Mar 12 15:27:50 crc kubenswrapper[4869]: I0312 15:27:50.110944 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:27:50 crc kubenswrapper[4869]: E0312 15:27:50.111316 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.141198 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555488-x9dqp"] Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.143462 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-x9dqp" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.149399 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.149965 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.150363 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.150615 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-x9dqp"] Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.208567 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnzxr\" (UniqueName: \"kubernetes.io/projected/f751fb96-803e-4ae0-9abd-62c03ea50ecd-kube-api-access-hnzxr\") pod \"auto-csr-approver-29555488-x9dqp\" (UID: \"f751fb96-803e-4ae0-9abd-62c03ea50ecd\") " pod="openshift-infra/auto-csr-approver-29555488-x9dqp" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.310115 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnzxr\" (UniqueName: \"kubernetes.io/projected/f751fb96-803e-4ae0-9abd-62c03ea50ecd-kube-api-access-hnzxr\") pod \"auto-csr-approver-29555488-x9dqp\" (UID: \"f751fb96-803e-4ae0-9abd-62c03ea50ecd\") " pod="openshift-infra/auto-csr-approver-29555488-x9dqp" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.336267 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnzxr\" (UniqueName: \"kubernetes.io/projected/f751fb96-803e-4ae0-9abd-62c03ea50ecd-kube-api-access-hnzxr\") pod \"auto-csr-approver-29555488-x9dqp\" (UID: \"f751fb96-803e-4ae0-9abd-62c03ea50ecd\") " 
pod="openshift-infra/auto-csr-approver-29555488-x9dqp" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.470650 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-x9dqp" Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.962355 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-x9dqp"] Mar 12 15:28:00 crc kubenswrapper[4869]: I0312 15:28:00.968316 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:28:01 crc kubenswrapper[4869]: I0312 15:28:01.229519 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555488-x9dqp" event={"ID":"f751fb96-803e-4ae0-9abd-62c03ea50ecd","Type":"ContainerStarted","Data":"7ca3e67de139fa7e9e321aa917e0471fa8f4444e2fd59d78a72e8d3991689bb7"} Mar 12 15:28:03 crc kubenswrapper[4869]: I0312 15:28:03.248009 4869 generic.go:334] "Generic (PLEG): container finished" podID="f751fb96-803e-4ae0-9abd-62c03ea50ecd" containerID="c12996e2251ffadbd88c2839ea7a1003235f7983b65dc8fbe95412b73961cc00" exitCode=0 Mar 12 15:28:03 crc kubenswrapper[4869]: I0312 15:28:03.248466 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555488-x9dqp" event={"ID":"f751fb96-803e-4ae0-9abd-62c03ea50ecd","Type":"ContainerDied","Data":"c12996e2251ffadbd88c2839ea7a1003235f7983b65dc8fbe95412b73961cc00"} Mar 12 15:28:04 crc kubenswrapper[4869]: I0312 15:28:04.565460 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-x9dqp" Mar 12 15:28:04 crc kubenswrapper[4869]: I0312 15:28:04.699599 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnzxr\" (UniqueName: \"kubernetes.io/projected/f751fb96-803e-4ae0-9abd-62c03ea50ecd-kube-api-access-hnzxr\") pod \"f751fb96-803e-4ae0-9abd-62c03ea50ecd\" (UID: \"f751fb96-803e-4ae0-9abd-62c03ea50ecd\") " Mar 12 15:28:04 crc kubenswrapper[4869]: I0312 15:28:04.711746 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f751fb96-803e-4ae0-9abd-62c03ea50ecd-kube-api-access-hnzxr" (OuterVolumeSpecName: "kube-api-access-hnzxr") pod "f751fb96-803e-4ae0-9abd-62c03ea50ecd" (UID: "f751fb96-803e-4ae0-9abd-62c03ea50ecd"). InnerVolumeSpecName "kube-api-access-hnzxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:28:04 crc kubenswrapper[4869]: I0312 15:28:04.802969 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnzxr\" (UniqueName: \"kubernetes.io/projected/f751fb96-803e-4ae0-9abd-62c03ea50ecd-kube-api-access-hnzxr\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:05 crc kubenswrapper[4869]: I0312 15:28:05.265125 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555488-x9dqp" event={"ID":"f751fb96-803e-4ae0-9abd-62c03ea50ecd","Type":"ContainerDied","Data":"7ca3e67de139fa7e9e321aa917e0471fa8f4444e2fd59d78a72e8d3991689bb7"} Mar 12 15:28:05 crc kubenswrapper[4869]: I0312 15:28:05.265173 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca3e67de139fa7e9e321aa917e0471fa8f4444e2fd59d78a72e8d3991689bb7" Mar 12 15:28:05 crc kubenswrapper[4869]: I0312 15:28:05.265238 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-x9dqp" Mar 12 15:28:05 crc kubenswrapper[4869]: I0312 15:28:05.336935 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:28:05 crc kubenswrapper[4869]: E0312 15:28:05.337395 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:28:05 crc kubenswrapper[4869]: I0312 15:28:05.628660 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-sjpgl"] Mar 12 15:28:05 crc kubenswrapper[4869]: I0312 15:28:05.636471 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-sjpgl"] Mar 12 15:28:06 crc kubenswrapper[4869]: I0312 15:28:06.345349 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76d71a8-ef71-49d2-815f-c043aa32ab16" path="/var/lib/kubelet/pods/f76d71a8-ef71-49d2-815f-c043aa32ab16/volumes" Mar 12 15:28:16 crc kubenswrapper[4869]: I0312 15:28:16.336210 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:28:16 crc kubenswrapper[4869]: E0312 15:28:16.337166 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:28:28 crc kubenswrapper[4869]: I0312 15:28:28.343917 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:28:28 crc kubenswrapper[4869]: E0312 15:28:28.344618 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:28:34 crc kubenswrapper[4869]: I0312 15:28:34.519118 4869 scope.go:117] "RemoveContainer" containerID="68df47a6888c65bfd924d124542625f146030dc647a468ae45a4614a4bdda223" Mar 12 15:28:41 crc kubenswrapper[4869]: I0312 15:28:41.337040 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:28:41 crc kubenswrapper[4869]: E0312 15:28:41.337815 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:28:46 crc kubenswrapper[4869]: I0312 15:28:46.648303 4869 generic.go:334] "Generic (PLEG): container finished" podID="8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" containerID="dbddde6d585d7a3334a39ac7dd1502401e8be57aae435bc7442cf3dd8330700a" exitCode=0 Mar 12 15:28:46 crc kubenswrapper[4869]: I0312 15:28:46.648413 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" event={"ID":"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4","Type":"ContainerDied","Data":"dbddde6d585d7a3334a39ac7dd1502401e8be57aae435bc7442cf3dd8330700a"} Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.131348 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.322195 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-1\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.322322 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-combined-ca-bundle\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.322355 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-1\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.322376 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-2\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.322408 4869 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-extra-config-0\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.322431 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-inventory\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.322452 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-ssh-key-openstack-edpm-ipam\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.323059 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-0\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.323172 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-3\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.323251 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mv8k\" (UniqueName: 
\"kubernetes.io/projected/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-kube-api-access-2mv8k\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.323312 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-0\") pod \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\" (UID: \"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4\") " Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.327927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.328875 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-kube-api-access-2mv8k" (OuterVolumeSpecName: "kube-api-access-2mv8k") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "kube-api-access-2mv8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.355478 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.356125 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.363052 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-inventory" (OuterVolumeSpecName: "inventory") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.369789 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.370458 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.376880 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.377182 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.381462 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.393080 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" (UID: "8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426111 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426145 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mv8k\" (UniqueName: \"kubernetes.io/projected/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-kube-api-access-2mv8k\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426154 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426163 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426173 4869 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426182 4869 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426192 4869 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426200 4869 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426209 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426218 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.426226 4869 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.668715 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" event={"ID":"8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4","Type":"ContainerDied","Data":"c78a14f6c1a1353868986c2fc4d23e18ab69ce9cdbb117c9d9df7fe8c37241d3"} Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.668757 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c78a14f6c1a1353868986c2fc4d23e18ab69ce9cdbb117c9d9df7fe8c37241d3" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.668777 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zssnd" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.785209 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4"] Mar 12 15:28:48 crc kubenswrapper[4869]: E0312 15:28:48.786149 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.786177 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 12 15:28:48 crc kubenswrapper[4869]: E0312 15:28:48.786193 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f751fb96-803e-4ae0-9abd-62c03ea50ecd" containerName="oc" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.786202 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="f751fb96-803e-4ae0-9abd-62c03ea50ecd" containerName="oc" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.786454 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="f751fb96-803e-4ae0-9abd-62c03ea50ecd" containerName="oc" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.786500 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.787323 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.790085 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-cxsgq" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.790480 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.790860 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.791069 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.795226 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.797808 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4"] Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.934202 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.934288 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: 
\"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.934392 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqf7\" (UniqueName: \"kubernetes.io/projected/2b744985-65a8-49f0-b177-85c19eeb86c1-kube-api-access-dnqf7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.934611 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.934663 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.935050 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:48 crc kubenswrapper[4869]: I0312 15:28:48.935194 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.037000 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.037147 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.037191 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.037258 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dnqf7\" (UniqueName: \"kubernetes.io/projected/2b744985-65a8-49f0-b177-85c19eeb86c1-kube-api-access-dnqf7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.037316 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.037341 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.037368 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.041655 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.044325 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.044902 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.044909 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.044902 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.051627 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.053426 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqf7\" (UniqueName: \"kubernetes.io/projected/2b744985-65a8-49f0-b177-85c19eeb86c1-kube-api-access-dnqf7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wmss4\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:49 crc kubenswrapper[4869]: I0312 15:28:49.145721 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:28:50 crc kubenswrapper[4869]: I0312 15:28:49.664357 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4"] Mar 12 15:28:50 crc kubenswrapper[4869]: I0312 15:28:49.683678 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" event={"ID":"2b744985-65a8-49f0-b177-85c19eeb86c1","Type":"ContainerStarted","Data":"43d574d28760c7b1c2a63a0b7c50220f0bb5f4312cd836ca4fa5a572b62af175"} Mar 12 15:28:50 crc kubenswrapper[4869]: I0312 15:28:50.693775 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" event={"ID":"2b744985-65a8-49f0-b177-85c19eeb86c1","Type":"ContainerStarted","Data":"4cae1c694651afdde11c3f9c168182e0b28567ae01924fc19d97e9618047207e"} Mar 12 15:28:50 crc kubenswrapper[4869]: I0312 15:28:50.722537 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" podStartSLOduration=2.242954196 podStartE2EDuration="2.722514531s" podCreationTimestamp="2026-03-12 15:28:48 +0000 UTC" firstStartedPulling="2026-03-12 15:28:49.670178486 +0000 UTC m=+2481.955403774" lastFinishedPulling="2026-03-12 15:28:50.149738841 +0000 UTC m=+2482.434964109" observedRunningTime="2026-03-12 15:28:50.719325321 +0000 UTC m=+2483.004550589" watchObservedRunningTime="2026-03-12 15:28:50.722514531 +0000 UTC m=+2483.007739809" Mar 12 15:28:54 crc kubenswrapper[4869]: I0312 15:28:54.336767 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:28:54 crc kubenswrapper[4869]: E0312 15:28:54.337293 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:29:07 crc kubenswrapper[4869]: I0312 15:29:07.336702 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:29:07 crc kubenswrapper[4869]: E0312 15:29:07.338406 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:29:18 crc kubenswrapper[4869]: I0312 15:29:18.345666 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:29:18 crc kubenswrapper[4869]: E0312 15:29:18.346454 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:29:29 crc kubenswrapper[4869]: I0312 15:29:29.337350 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:29:29 crc kubenswrapper[4869]: E0312 15:29:29.338319 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:29:44 crc kubenswrapper[4869]: I0312 15:29:44.338031 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:29:44 crc kubenswrapper[4869]: E0312 15:29:44.338958 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:29:59 crc kubenswrapper[4869]: I0312 15:29:59.336627 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:29:59 crc kubenswrapper[4869]: E0312 15:29:59.337807 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.162476 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555490-t6mmg"] Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.164723 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.167593 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.167943 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.168413 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.180461 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt"] Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.182011 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.183626 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.184399 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.192643 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-t6mmg"] Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.200891 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt"] Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.245809 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99j94\" (UniqueName: 
\"kubernetes.io/projected/50f29abb-8a8d-4684-86f8-f89eff242676-kube-api-access-99j94\") pod \"auto-csr-approver-29555490-t6mmg\" (UID: \"50f29abb-8a8d-4684-86f8-f89eff242676\") " pod="openshift-infra/auto-csr-approver-29555490-t6mmg" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.245860 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-config-volume\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.245905 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-secret-volume\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.246566 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q6k8\" (UniqueName: \"kubernetes.io/projected/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-kube-api-access-9q6k8\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.348832 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99j94\" (UniqueName: \"kubernetes.io/projected/50f29abb-8a8d-4684-86f8-f89eff242676-kube-api-access-99j94\") pod \"auto-csr-approver-29555490-t6mmg\" (UID: \"50f29abb-8a8d-4684-86f8-f89eff242676\") " pod="openshift-infra/auto-csr-approver-29555490-t6mmg" Mar 12 15:30:00 
crc kubenswrapper[4869]: I0312 15:30:00.348893 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-config-volume\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.348943 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-secret-volume\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.349029 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q6k8\" (UniqueName: \"kubernetes.io/projected/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-kube-api-access-9q6k8\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.350567 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-config-volume\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.359527 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-secret-volume\") pod \"collect-profiles-29555490-j8sjt\" (UID: 
\"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.366467 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q6k8\" (UniqueName: \"kubernetes.io/projected/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-kube-api-access-9q6k8\") pod \"collect-profiles-29555490-j8sjt\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.370081 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99j94\" (UniqueName: \"kubernetes.io/projected/50f29abb-8a8d-4684-86f8-f89eff242676-kube-api-access-99j94\") pod \"auto-csr-approver-29555490-t6mmg\" (UID: \"50f29abb-8a8d-4684-86f8-f89eff242676\") " pod="openshift-infra/auto-csr-approver-29555490-t6mmg" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.497947 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.515182 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.978233 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-t6mmg"] Mar 12 15:30:00 crc kubenswrapper[4869]: I0312 15:30:00.991747 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt"] Mar 12 15:30:01 crc kubenswrapper[4869]: I0312 15:30:01.356136 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" event={"ID":"50f29abb-8a8d-4684-86f8-f89eff242676","Type":"ContainerStarted","Data":"d268b58bae124fca34075d0ee6b1fe36db93fde2a81bd32e8deee1e76a3b70a1"} Mar 12 15:30:01 crc kubenswrapper[4869]: I0312 15:30:01.358048 4869 generic.go:334] "Generic (PLEG): container finished" podID="4f0f32cb-cab5-4ebb-8ae4-497fd240d50c" containerID="b7d01bb74941dfabda9c962ab403b264a85807a381daeefe134e7b4296a84be8" exitCode=0 Mar 12 15:30:01 crc kubenswrapper[4869]: I0312 15:30:01.358085 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" event={"ID":"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c","Type":"ContainerDied","Data":"b7d01bb74941dfabda9c962ab403b264a85807a381daeefe134e7b4296a84be8"} Mar 12 15:30:01 crc kubenswrapper[4869]: I0312 15:30:01.358107 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" event={"ID":"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c","Type":"ContainerStarted","Data":"937f14203fd498bf09faf6ce952a3f5c1e97bec7afcb9014f0475a7547ba4a6a"} Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.731171 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.807339 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q6k8\" (UniqueName: \"kubernetes.io/projected/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-kube-api-access-9q6k8\") pod \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.807528 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-config-volume\") pod \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.807619 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-secret-volume\") pod \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\" (UID: \"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c\") " Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.808311 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-config-volume" (OuterVolumeSpecName: "config-volume") pod "4f0f32cb-cab5-4ebb-8ae4-497fd240d50c" (UID: "4f0f32cb-cab5-4ebb-8ae4-497fd240d50c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.813350 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4f0f32cb-cab5-4ebb-8ae4-497fd240d50c" (UID: "4f0f32cb-cab5-4ebb-8ae4-497fd240d50c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.814020 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-kube-api-access-9q6k8" (OuterVolumeSpecName: "kube-api-access-9q6k8") pod "4f0f32cb-cab5-4ebb-8ae4-497fd240d50c" (UID: "4f0f32cb-cab5-4ebb-8ae4-497fd240d50c"). InnerVolumeSpecName "kube-api-access-9q6k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.909355 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q6k8\" (UniqueName: \"kubernetes.io/projected/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-kube-api-access-9q6k8\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.909391 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:02 crc kubenswrapper[4869]: I0312 15:30:02.909401 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4f0f32cb-cab5-4ebb-8ae4-497fd240d50c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:03 crc kubenswrapper[4869]: I0312 15:30:03.375852 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" event={"ID":"50f29abb-8a8d-4684-86f8-f89eff242676","Type":"ContainerStarted","Data":"7f66a56bd537fafd7aeac991f2cbe5803d91dcf9fa0874d0d894b1c0849796c6"} Mar 12 15:30:03 crc kubenswrapper[4869]: I0312 15:30:03.377275 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" event={"ID":"4f0f32cb-cab5-4ebb-8ae4-497fd240d50c","Type":"ContainerDied","Data":"937f14203fd498bf09faf6ce952a3f5c1e97bec7afcb9014f0475a7547ba4a6a"} Mar 
12 15:30:03 crc kubenswrapper[4869]: I0312 15:30:03.377322 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="937f14203fd498bf09faf6ce952a3f5c1e97bec7afcb9014f0475a7547ba4a6a" Mar 12 15:30:03 crc kubenswrapper[4869]: I0312 15:30:03.377335 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-j8sjt" Mar 12 15:30:03 crc kubenswrapper[4869]: I0312 15:30:03.397903 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" podStartSLOduration=1.380810287 podStartE2EDuration="3.397880677s" podCreationTimestamp="2026-03-12 15:30:00 +0000 UTC" firstStartedPulling="2026-03-12 15:30:00.979783666 +0000 UTC m=+2553.265008944" lastFinishedPulling="2026-03-12 15:30:02.996854056 +0000 UTC m=+2555.282079334" observedRunningTime="2026-03-12 15:30:03.388334219 +0000 UTC m=+2555.673559517" watchObservedRunningTime="2026-03-12 15:30:03.397880677 +0000 UTC m=+2555.683105955" Mar 12 15:30:03 crc kubenswrapper[4869]: I0312 15:30:03.812098 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n"] Mar 12 15:30:03 crc kubenswrapper[4869]: I0312 15:30:03.821779 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-z4j5n"] Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.081267 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-28zhl"] Mar 12 15:30:04 crc kubenswrapper[4869]: E0312 15:30:04.081821 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0f32cb-cab5-4ebb-8ae4-497fd240d50c" containerName="collect-profiles" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.081845 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0f32cb-cab5-4ebb-8ae4-497fd240d50c" 
containerName="collect-profiles" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.082074 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f0f32cb-cab5-4ebb-8ae4-497fd240d50c" containerName="collect-profiles" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.083707 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.095020 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28zhl"] Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.132193 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-utilities\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.132400 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpzhs\" (UniqueName: \"kubernetes.io/projected/d8127238-0a15-41ca-bb51-9d83145bac33-kube-api-access-qpzhs\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.132722 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-catalog-content\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.234486 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-catalog-content\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.234810 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-utilities\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.234942 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpzhs\" (UniqueName: \"kubernetes.io/projected/d8127238-0a15-41ca-bb51-9d83145bac33-kube-api-access-qpzhs\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.234953 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-catalog-content\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.235235 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-utilities\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.265689 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpzhs\" (UniqueName: 
\"kubernetes.io/projected/d8127238-0a15-41ca-bb51-9d83145bac33-kube-api-access-qpzhs\") pod \"redhat-operators-28zhl\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.349515 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b1e4b6-8e9c-4e12-8627-469e056beee5" path="/var/lib/kubelet/pods/22b1e4b6-8e9c-4e12-8627-469e056beee5/volumes" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.386184 4869 generic.go:334] "Generic (PLEG): container finished" podID="50f29abb-8a8d-4684-86f8-f89eff242676" containerID="7f66a56bd537fafd7aeac991f2cbe5803d91dcf9fa0874d0d894b1c0849796c6" exitCode=0 Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.386367 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" event={"ID":"50f29abb-8a8d-4684-86f8-f89eff242676","Type":"ContainerDied","Data":"7f66a56bd537fafd7aeac991f2cbe5803d91dcf9fa0874d0d894b1c0849796c6"} Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.401492 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:04 crc kubenswrapper[4869]: I0312 15:30:04.865772 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28zhl"] Mar 12 15:30:04 crc kubenswrapper[4869]: W0312 15:30:04.869007 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8127238_0a15_41ca_bb51_9d83145bac33.slice/crio-9249e3c3e89eee1e545a95bf4605e1efa79f85db3593c40bd9a707f316f1a3b3 WatchSource:0}: Error finding container 9249e3c3e89eee1e545a95bf4605e1efa79f85db3593c40bd9a707f316f1a3b3: Status 404 returned error can't find the container with id 9249e3c3e89eee1e545a95bf4605e1efa79f85db3593c40bd9a707f316f1a3b3 Mar 12 15:30:05 crc kubenswrapper[4869]: I0312 15:30:05.395733 4869 generic.go:334] "Generic (PLEG): container finished" podID="d8127238-0a15-41ca-bb51-9d83145bac33" containerID="a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c" exitCode=0 Mar 12 15:30:05 crc kubenswrapper[4869]: I0312 15:30:05.395846 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28zhl" event={"ID":"d8127238-0a15-41ca-bb51-9d83145bac33","Type":"ContainerDied","Data":"a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c"} Mar 12 15:30:05 crc kubenswrapper[4869]: I0312 15:30:05.396047 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28zhl" event={"ID":"d8127238-0a15-41ca-bb51-9d83145bac33","Type":"ContainerStarted","Data":"9249e3c3e89eee1e545a95bf4605e1efa79f85db3593c40bd9a707f316f1a3b3"} Mar 12 15:30:05 crc kubenswrapper[4869]: I0312 15:30:05.732823 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" Mar 12 15:30:05 crc kubenswrapper[4869]: I0312 15:30:05.765177 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99j94\" (UniqueName: \"kubernetes.io/projected/50f29abb-8a8d-4684-86f8-f89eff242676-kube-api-access-99j94\") pod \"50f29abb-8a8d-4684-86f8-f89eff242676\" (UID: \"50f29abb-8a8d-4684-86f8-f89eff242676\") " Mar 12 15:30:05 crc kubenswrapper[4869]: I0312 15:30:05.780366 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f29abb-8a8d-4684-86f8-f89eff242676-kube-api-access-99j94" (OuterVolumeSpecName: "kube-api-access-99j94") pod "50f29abb-8a8d-4684-86f8-f89eff242676" (UID: "50f29abb-8a8d-4684-86f8-f89eff242676"). InnerVolumeSpecName "kube-api-access-99j94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:30:05 crc kubenswrapper[4869]: I0312 15:30:05.867497 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99j94\" (UniqueName: \"kubernetes.io/projected/50f29abb-8a8d-4684-86f8-f89eff242676-kube-api-access-99j94\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:06 crc kubenswrapper[4869]: I0312 15:30:06.407415 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28zhl" event={"ID":"d8127238-0a15-41ca-bb51-9d83145bac33","Type":"ContainerStarted","Data":"a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc"} Mar 12 15:30:06 crc kubenswrapper[4869]: I0312 15:30:06.412621 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" event={"ID":"50f29abb-8a8d-4684-86f8-f89eff242676","Type":"ContainerDied","Data":"d268b58bae124fca34075d0ee6b1fe36db93fde2a81bd32e8deee1e76a3b70a1"} Mar 12 15:30:06 crc kubenswrapper[4869]: I0312 15:30:06.412756 4869 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d268b58bae124fca34075d0ee6b1fe36db93fde2a81bd32e8deee1e76a3b70a1" Mar 12 15:30:06 crc kubenswrapper[4869]: I0312 15:30:06.412867 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-t6mmg" Mar 12 15:30:06 crc kubenswrapper[4869]: I0312 15:30:06.477504 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-zk5zb"] Mar 12 15:30:06 crc kubenswrapper[4869]: I0312 15:30:06.489167 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-zk5zb"] Mar 12 15:30:08 crc kubenswrapper[4869]: I0312 15:30:08.346704 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5371f8-4455-4af9-a234-c4c9c8de6ec9" path="/var/lib/kubelet/pods/4c5371f8-4455-4af9-a234-c4c9c8de6ec9/volumes" Mar 12 15:30:10 crc kubenswrapper[4869]: I0312 15:30:10.453082 4869 generic.go:334] "Generic (PLEG): container finished" podID="d8127238-0a15-41ca-bb51-9d83145bac33" containerID="a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc" exitCode=0 Mar 12 15:30:10 crc kubenswrapper[4869]: I0312 15:30:10.453197 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28zhl" event={"ID":"d8127238-0a15-41ca-bb51-9d83145bac33","Type":"ContainerDied","Data":"a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc"} Mar 12 15:30:11 crc kubenswrapper[4869]: I0312 15:30:11.466017 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28zhl" event={"ID":"d8127238-0a15-41ca-bb51-9d83145bac33","Type":"ContainerStarted","Data":"6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855"} Mar 12 15:30:11 crc kubenswrapper[4869]: I0312 15:30:11.493182 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-28zhl" podStartSLOduration=2.013080098 
podStartE2EDuration="7.493162934s" podCreationTimestamp="2026-03-12 15:30:04 +0000 UTC" firstStartedPulling="2026-03-12 15:30:05.397662722 +0000 UTC m=+2557.682888000" lastFinishedPulling="2026-03-12 15:30:10.877745558 +0000 UTC m=+2563.162970836" observedRunningTime="2026-03-12 15:30:11.487648309 +0000 UTC m=+2563.772873617" watchObservedRunningTime="2026-03-12 15:30:11.493162934 +0000 UTC m=+2563.778388212" Mar 12 15:30:14 crc kubenswrapper[4869]: I0312 15:30:14.337404 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:30:14 crc kubenswrapper[4869]: E0312 15:30:14.338089 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:30:14 crc kubenswrapper[4869]: I0312 15:30:14.402328 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:14 crc kubenswrapper[4869]: I0312 15:30:14.402434 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:15 crc kubenswrapper[4869]: I0312 15:30:15.449021 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-28zhl" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="registry-server" probeResult="failure" output=< Mar 12 15:30:15 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 15:30:15 crc kubenswrapper[4869]: > Mar 12 15:30:24 crc kubenswrapper[4869]: I0312 15:30:24.458660 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:24 crc kubenswrapper[4869]: I0312 15:30:24.527434 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:24 crc kubenswrapper[4869]: I0312 15:30:24.700979 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28zhl"] Mar 12 15:30:25 crc kubenswrapper[4869]: I0312 15:30:25.601804 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-28zhl" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="registry-server" containerID="cri-o://6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855" gracePeriod=2 Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.048953 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.175450 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpzhs\" (UniqueName: \"kubernetes.io/projected/d8127238-0a15-41ca-bb51-9d83145bac33-kube-api-access-qpzhs\") pod \"d8127238-0a15-41ca-bb51-9d83145bac33\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.175593 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-catalog-content\") pod \"d8127238-0a15-41ca-bb51-9d83145bac33\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.175675 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-utilities\") pod 
\"d8127238-0a15-41ca-bb51-9d83145bac33\" (UID: \"d8127238-0a15-41ca-bb51-9d83145bac33\") " Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.176673 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-utilities" (OuterVolumeSpecName: "utilities") pod "d8127238-0a15-41ca-bb51-9d83145bac33" (UID: "d8127238-0a15-41ca-bb51-9d83145bac33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.181921 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8127238-0a15-41ca-bb51-9d83145bac33-kube-api-access-qpzhs" (OuterVolumeSpecName: "kube-api-access-qpzhs") pod "d8127238-0a15-41ca-bb51-9d83145bac33" (UID: "d8127238-0a15-41ca-bb51-9d83145bac33"). InnerVolumeSpecName "kube-api-access-qpzhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.279031 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpzhs\" (UniqueName: \"kubernetes.io/projected/d8127238-0a15-41ca-bb51-9d83145bac33-kube-api-access-qpzhs\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.279058 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.305610 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8127238-0a15-41ca-bb51-9d83145bac33" (UID: "d8127238-0a15-41ca-bb51-9d83145bac33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.337169 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:30:26 crc kubenswrapper[4869]: E0312 15:30:26.337512 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.381769 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8127238-0a15-41ca-bb51-9d83145bac33-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.611763 4869 generic.go:334] "Generic (PLEG): container finished" podID="d8127238-0a15-41ca-bb51-9d83145bac33" containerID="6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855" exitCode=0 Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.611818 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-28zhl" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.611807 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28zhl" event={"ID":"d8127238-0a15-41ca-bb51-9d83145bac33","Type":"ContainerDied","Data":"6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855"} Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.611934 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28zhl" event={"ID":"d8127238-0a15-41ca-bb51-9d83145bac33","Type":"ContainerDied","Data":"9249e3c3e89eee1e545a95bf4605e1efa79f85db3593c40bd9a707f316f1a3b3"} Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.611953 4869 scope.go:117] "RemoveContainer" containerID="6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.635479 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28zhl"] Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.643888 4869 scope.go:117] "RemoveContainer" containerID="a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.645583 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-28zhl"] Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.674878 4869 scope.go:117] "RemoveContainer" containerID="a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.717628 4869 scope.go:117] "RemoveContainer" containerID="6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855" Mar 12 15:30:26 crc kubenswrapper[4869]: E0312 15:30:26.718174 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855\": container with ID starting with 6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855 not found: ID does not exist" containerID="6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.718221 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855"} err="failed to get container status \"6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855\": rpc error: code = NotFound desc = could not find container \"6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855\": container with ID starting with 6fdb63b75b94886d81b79d6a561da0fa2cedc999f720f71052069d8a46978855 not found: ID does not exist" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.718248 4869 scope.go:117] "RemoveContainer" containerID="a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc" Mar 12 15:30:26 crc kubenswrapper[4869]: E0312 15:30:26.718532 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc\": container with ID starting with a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc not found: ID does not exist" containerID="a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.718574 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc"} err="failed to get container status \"a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc\": rpc error: code = NotFound desc = could not find container \"a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc\": container with ID 
starting with a382ffd64a4298d8a175feab43c030ebc290952cbf0742d2ea555683b1e20ffc not found: ID does not exist" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.718593 4869 scope.go:117] "RemoveContainer" containerID="a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c" Mar 12 15:30:26 crc kubenswrapper[4869]: E0312 15:30:26.718851 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c\": container with ID starting with a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c not found: ID does not exist" containerID="a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c" Mar 12 15:30:26 crc kubenswrapper[4869]: I0312 15:30:26.718874 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c"} err="failed to get container status \"a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c\": rpc error: code = NotFound desc = could not find container \"a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c\": container with ID starting with a4a8eab55b48f4a7ee25b1f4f153aee3dde8fb39ff1a316d45368d399abaac9c not found: ID does not exist" Mar 12 15:30:28 crc kubenswrapper[4869]: I0312 15:30:28.361977 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" path="/var/lib/kubelet/pods/d8127238-0a15-41ca-bb51-9d83145bac33/volumes" Mar 12 15:30:34 crc kubenswrapper[4869]: I0312 15:30:34.604686 4869 scope.go:117] "RemoveContainer" containerID="f0b3a4fd58f3b8d4d5103d22eee5bb2e2cb19c1f2f54ddf41c09d93e9caf40d7" Mar 12 15:30:34 crc kubenswrapper[4869]: I0312 15:30:34.637280 4869 scope.go:117] "RemoveContainer" containerID="ccb610c4a3ac79138c5a060ba9fbe09037b3fd9fc8acebcbcdc8da98d09108b8" Mar 12 15:30:41 crc kubenswrapper[4869]: 
I0312 15:30:41.337235 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:30:41 crc kubenswrapper[4869]: E0312 15:30:41.338280 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:30:52 crc kubenswrapper[4869]: I0312 15:30:52.337403 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:30:52 crc kubenswrapper[4869]: E0312 15:30:52.338485 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:31:06 crc kubenswrapper[4869]: I0312 15:31:06.337020 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:31:06 crc kubenswrapper[4869]: E0312 15:31:06.337939 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:31:08 crc 
kubenswrapper[4869]: I0312 15:31:08.998420 4869 generic.go:334] "Generic (PLEG): container finished" podID="2b744985-65a8-49f0-b177-85c19eeb86c1" containerID="4cae1c694651afdde11c3f9c168182e0b28567ae01924fc19d97e9618047207e" exitCode=0 Mar 12 15:31:08 crc kubenswrapper[4869]: I0312 15:31:08.998626 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" event={"ID":"2b744985-65a8-49f0-b177-85c19eeb86c1","Type":"ContainerDied","Data":"4cae1c694651afdde11c3f9c168182e0b28567ae01924fc19d97e9618047207e"} Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.438226 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.636528 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-2\") pod \"2b744985-65a8-49f0-b177-85c19eeb86c1\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.637119 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ssh-key-openstack-edpm-ipam\") pod \"2b744985-65a8-49f0-b177-85c19eeb86c1\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.637319 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-0\") pod \"2b744985-65a8-49f0-b177-85c19eeb86c1\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.637511 
4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-inventory\") pod \"2b744985-65a8-49f0-b177-85c19eeb86c1\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.637654 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnqf7\" (UniqueName: \"kubernetes.io/projected/2b744985-65a8-49f0-b177-85c19eeb86c1-kube-api-access-dnqf7\") pod \"2b744985-65a8-49f0-b177-85c19eeb86c1\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.637779 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-1\") pod \"2b744985-65a8-49f0-b177-85c19eeb86c1\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.637904 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-telemetry-combined-ca-bundle\") pod \"2b744985-65a8-49f0-b177-85c19eeb86c1\" (UID: \"2b744985-65a8-49f0-b177-85c19eeb86c1\") " Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.643322 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2b744985-65a8-49f0-b177-85c19eeb86c1" (UID: "2b744985-65a8-49f0-b177-85c19eeb86c1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.643527 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b744985-65a8-49f0-b177-85c19eeb86c1-kube-api-access-dnqf7" (OuterVolumeSpecName: "kube-api-access-dnqf7") pod "2b744985-65a8-49f0-b177-85c19eeb86c1" (UID: "2b744985-65a8-49f0-b177-85c19eeb86c1"). InnerVolumeSpecName "kube-api-access-dnqf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.669172 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2b744985-65a8-49f0-b177-85c19eeb86c1" (UID: "2b744985-65a8-49f0-b177-85c19eeb86c1"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.675482 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2b744985-65a8-49f0-b177-85c19eeb86c1" (UID: "2b744985-65a8-49f0-b177-85c19eeb86c1"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.680469 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2b744985-65a8-49f0-b177-85c19eeb86c1" (UID: "2b744985-65a8-49f0-b177-85c19eeb86c1"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.683735 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-inventory" (OuterVolumeSpecName: "inventory") pod "2b744985-65a8-49f0-b177-85c19eeb86c1" (UID: "2b744985-65a8-49f0-b177-85c19eeb86c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.685728 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2b744985-65a8-49f0-b177-85c19eeb86c1" (UID: "2b744985-65a8-49f0-b177-85c19eeb86c1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.739446 4869 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.739476 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.739489 4869 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.739503 4869 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.739517 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnqf7\" (UniqueName: \"kubernetes.io/projected/2b744985-65a8-49f0-b177-85c19eeb86c1-kube-api-access-dnqf7\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.739530 4869 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:10 crc kubenswrapper[4869]: I0312 15:31:10.739560 4869 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b744985-65a8-49f0-b177-85c19eeb86c1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:11 crc kubenswrapper[4869]: I0312 15:31:11.021492 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" event={"ID":"2b744985-65a8-49f0-b177-85c19eeb86c1","Type":"ContainerDied","Data":"43d574d28760c7b1c2a63a0b7c50220f0bb5f4312cd836ca4fa5a572b62af175"} Mar 12 15:31:11 crc kubenswrapper[4869]: I0312 15:31:11.021532 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d574d28760c7b1c2a63a0b7c50220f0bb5f4312cd836ca4fa5a572b62af175" Mar 12 15:31:11 crc kubenswrapper[4869]: I0312 15:31:11.021605 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wmss4" Mar 12 15:31:19 crc kubenswrapper[4869]: I0312 15:31:19.336523 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:31:19 crc kubenswrapper[4869]: E0312 15:31:19.337513 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:31:33 crc kubenswrapper[4869]: I0312 15:31:33.337569 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:31:33 crc kubenswrapper[4869]: E0312 15:31:33.338425 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:31:45 crc kubenswrapper[4869]: I0312 15:31:45.336118 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:31:45 crc kubenswrapper[4869]: E0312 15:31:45.337306 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.140674 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555492-cpxgp"] Mar 12 15:32:00 crc kubenswrapper[4869]: E0312 15:32:00.141999 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="extract-content" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142022 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="extract-content" Mar 12 15:32:00 crc kubenswrapper[4869]: E0312 15:32:00.142080 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f29abb-8a8d-4684-86f8-f89eff242676" containerName="oc" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142095 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f29abb-8a8d-4684-86f8-f89eff242676" containerName="oc" Mar 12 15:32:00 crc kubenswrapper[4869]: E0312 15:32:00.142110 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="extract-utilities" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142124 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="extract-utilities" Mar 12 15:32:00 crc kubenswrapper[4869]: E0312 15:32:00.142150 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="registry-server" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142166 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="registry-server" Mar 12 15:32:00 crc kubenswrapper[4869]: E0312 15:32:00.142205 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2b744985-65a8-49f0-b177-85c19eeb86c1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142224 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b744985-65a8-49f0-b177-85c19eeb86c1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142637 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b744985-65a8-49f0-b177-85c19eeb86c1" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142674 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f29abb-8a8d-4684-86f8-f89eff242676" containerName="oc" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.142705 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8127238-0a15-41ca-bb51-9d83145bac33" containerName="registry-server" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.143810 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.146357 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.147922 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.148002 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.149579 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-cpxgp"] Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.231959 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2dsn\" (UniqueName: \"kubernetes.io/projected/745ca9cb-c31f-4c01-aaff-a67e2f6cb59c-kube-api-access-n2dsn\") pod \"auto-csr-approver-29555492-cpxgp\" (UID: \"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c\") " pod="openshift-infra/auto-csr-approver-29555492-cpxgp" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.335587 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2dsn\" (UniqueName: \"kubernetes.io/projected/745ca9cb-c31f-4c01-aaff-a67e2f6cb59c-kube-api-access-n2dsn\") pod \"auto-csr-approver-29555492-cpxgp\" (UID: \"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c\") " pod="openshift-infra/auto-csr-approver-29555492-cpxgp" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.336671 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:32:00 crc kubenswrapper[4869]: E0312 15:32:00.336996 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.362298 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2dsn\" (UniqueName: \"kubernetes.io/projected/745ca9cb-c31f-4c01-aaff-a67e2f6cb59c-kube-api-access-n2dsn\") pod \"auto-csr-approver-29555492-cpxgp\" (UID: \"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c\") " pod="openshift-infra/auto-csr-approver-29555492-cpxgp" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.470908 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" Mar 12 15:32:00 crc kubenswrapper[4869]: I0312 15:32:00.976862 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-cpxgp"] Mar 12 15:32:01 crc kubenswrapper[4869]: I0312 15:32:01.499748 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" event={"ID":"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c","Type":"ContainerStarted","Data":"76435d928bbff7d750e0e9849bf9cb34ae4ef99d78f7b9d5c2691f047b0e5da3"} Mar 12 15:32:02 crc kubenswrapper[4869]: I0312 15:32:02.509623 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" event={"ID":"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c","Type":"ContainerStarted","Data":"91be9c4955bb86d47a1ab3bbdc0770ad1a62689768bf9ba14393d1b1abbeaf55"} Mar 12 15:32:02 crc kubenswrapper[4869]: I0312 15:32:02.524119 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" podStartSLOduration=1.475145325 podStartE2EDuration="2.524102706s" 
podCreationTimestamp="2026-03-12 15:32:00 +0000 UTC" firstStartedPulling="2026-03-12 15:32:00.978029838 +0000 UTC m=+2673.263255116" lastFinishedPulling="2026-03-12 15:32:02.026987219 +0000 UTC m=+2674.312212497" observedRunningTime="2026-03-12 15:32:02.522056529 +0000 UTC m=+2674.807281807" watchObservedRunningTime="2026-03-12 15:32:02.524102706 +0000 UTC m=+2674.809327984" Mar 12 15:32:03 crc kubenswrapper[4869]: I0312 15:32:03.519173 4869 generic.go:334] "Generic (PLEG): container finished" podID="745ca9cb-c31f-4c01-aaff-a67e2f6cb59c" containerID="91be9c4955bb86d47a1ab3bbdc0770ad1a62689768bf9ba14393d1b1abbeaf55" exitCode=0 Mar 12 15:32:03 crc kubenswrapper[4869]: I0312 15:32:03.519232 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" event={"ID":"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c","Type":"ContainerDied","Data":"91be9c4955bb86d47a1ab3bbdc0770ad1a62689768bf9ba14393d1b1abbeaf55"} Mar 12 15:32:04 crc kubenswrapper[4869]: I0312 15:32:04.877721 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" Mar 12 15:32:04 crc kubenswrapper[4869]: I0312 15:32:04.918103 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2dsn\" (UniqueName: \"kubernetes.io/projected/745ca9cb-c31f-4c01-aaff-a67e2f6cb59c-kube-api-access-n2dsn\") pod \"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c\" (UID: \"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c\") " Mar 12 15:32:04 crc kubenswrapper[4869]: I0312 15:32:04.924647 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745ca9cb-c31f-4c01-aaff-a67e2f6cb59c-kube-api-access-n2dsn" (OuterVolumeSpecName: "kube-api-access-n2dsn") pod "745ca9cb-c31f-4c01-aaff-a67e2f6cb59c" (UID: "745ca9cb-c31f-4c01-aaff-a67e2f6cb59c"). InnerVolumeSpecName "kube-api-access-n2dsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:32:05 crc kubenswrapper[4869]: I0312 15:32:05.020317 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2dsn\" (UniqueName: \"kubernetes.io/projected/745ca9cb-c31f-4c01-aaff-a67e2f6cb59c-kube-api-access-n2dsn\") on node \"crc\" DevicePath \"\"" Mar 12 15:32:05 crc kubenswrapper[4869]: I0312 15:32:05.538788 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" event={"ID":"745ca9cb-c31f-4c01-aaff-a67e2f6cb59c","Type":"ContainerDied","Data":"76435d928bbff7d750e0e9849bf9cb34ae4ef99d78f7b9d5c2691f047b0e5da3"} Mar 12 15:32:05 crc kubenswrapper[4869]: I0312 15:32:05.538857 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-cpxgp" Mar 12 15:32:05 crc kubenswrapper[4869]: I0312 15:32:05.538865 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76435d928bbff7d750e0e9849bf9cb34ae4ef99d78f7b9d5c2691f047b0e5da3" Mar 12 15:32:05 crc kubenswrapper[4869]: I0312 15:32:05.595332 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-27ldq"] Mar 12 15:32:05 crc kubenswrapper[4869]: I0312 15:32:05.604178 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-27ldq"] Mar 12 15:32:06 crc kubenswrapper[4869]: I0312 15:32:06.349473 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d393e968-8951-49cf-8ff2-5f14d4230b14" path="/var/lib/kubelet/pods/d393e968-8951-49cf-8ff2-5f14d4230b14/volumes" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.537322 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tz2q4"] Mar 12 15:32:12 crc kubenswrapper[4869]: E0312 15:32:12.539621 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="745ca9cb-c31f-4c01-aaff-a67e2f6cb59c" containerName="oc" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.539710 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="745ca9cb-c31f-4c01-aaff-a67e2f6cb59c" containerName="oc" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.540040 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="745ca9cb-c31f-4c01-aaff-a67e2f6cb59c" containerName="oc" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.541861 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.549063 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz2q4"] Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.569222 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpwlk\" (UniqueName: \"kubernetes.io/projected/64b10dee-f478-4509-bb67-840ee41733e1-kube-api-access-mpwlk\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.569301 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-utilities\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.569363 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-catalog-content\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " 
pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.670294 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-utilities\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.670383 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-catalog-content\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.670518 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpwlk\" (UniqueName: \"kubernetes.io/projected/64b10dee-f478-4509-bb67-840ee41733e1-kube-api-access-mpwlk\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.670799 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-utilities\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.670811 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-catalog-content\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" 
Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.687886 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpwlk\" (UniqueName: \"kubernetes.io/projected/64b10dee-f478-4509-bb67-840ee41733e1-kube-api-access-mpwlk\") pod \"redhat-marketplace-tz2q4\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:12 crc kubenswrapper[4869]: I0312 15:32:12.862767 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.336801 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:32:13 crc kubenswrapper[4869]: E0312 15:32:13.337428 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.341229 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz2q4"] Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.487501 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.489369 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.492253 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.492592 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.492774 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.496817 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.590436 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.590612 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-config-data\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.590659 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: 
I0312 15:32:13.607066 4869 generic.go:334] "Generic (PLEG): container finished" podID="64b10dee-f478-4509-bb67-840ee41733e1" containerID="648d3a4930c214ca37b985092c844093c75c21bd6b0df8a6d1a723b721cc8163" exitCode=0 Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.607113 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz2q4" event={"ID":"64b10dee-f478-4509-bb67-840ee41733e1","Type":"ContainerDied","Data":"648d3a4930c214ca37b985092c844093c75c21bd6b0df8a6d1a723b721cc8163"} Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.607138 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz2q4" event={"ID":"64b10dee-f478-4509-bb67-840ee41733e1","Type":"ContainerStarted","Data":"12da2e3d871b3360337e06c21f45a09e3243cbcf32e8ccc2e45267253479b87f"} Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.692670 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.692736 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.692762 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") 
" pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.692792 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.692840 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-config-data\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.692869 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.692945 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.693005 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvrjr\" (UniqueName: \"kubernetes.io/projected/e6182d72-d424-4a24-bb32-1c43aaa82bba-kube-api-access-nvrjr\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " 
pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.693040 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.694442 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-config-data\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.694722 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.698802 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.794861 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.794912 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.795015 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.795251 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.795466 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.795571 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.796323 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvrjr\" (UniqueName: \"kubernetes.io/projected/e6182d72-d424-4a24-bb32-1c43aaa82bba-kube-api-access-nvrjr\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.796383 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.796493 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.801086 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.803437 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.815210 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvrjr\" (UniqueName: 
\"kubernetes.io/projected/e6182d72-d424-4a24-bb32-1c43aaa82bba-kube-api-access-nvrjr\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:13 crc kubenswrapper[4869]: I0312 15:32:13.827873 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " pod="openstack/tempest-tests-tempest" Mar 12 15:32:14 crc kubenswrapper[4869]: I0312 15:32:14.122587 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 15:32:14 crc kubenswrapper[4869]: W0312 15:32:14.587245 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6182d72_d424_4a24_bb32_1c43aaa82bba.slice/crio-ce545262dd1c3a54f5126f3464c4afe21b45cc3054056538eac4411c4c477233 WatchSource:0}: Error finding container ce545262dd1c3a54f5126f3464c4afe21b45cc3054056538eac4411c4c477233: Status 404 returned error can't find the container with id ce545262dd1c3a54f5126f3464c4afe21b45cc3054056538eac4411c4c477233 Mar 12 15:32:14 crc kubenswrapper[4869]: I0312 15:32:14.590099 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 12 15:32:14 crc kubenswrapper[4869]: I0312 15:32:14.617549 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e6182d72-d424-4a24-bb32-1c43aaa82bba","Type":"ContainerStarted","Data":"ce545262dd1c3a54f5126f3464c4afe21b45cc3054056538eac4411c4c477233"} Mar 12 15:32:14 crc kubenswrapper[4869]: I0312 15:32:14.619622 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz2q4" 
event={"ID":"64b10dee-f478-4509-bb67-840ee41733e1","Type":"ContainerStarted","Data":"c43bf5b2c6fe6bcc6aa0fecf37d4c5c349b5c73087ab3ebc12bf93f709af9495"} Mar 12 15:32:15 crc kubenswrapper[4869]: I0312 15:32:15.629570 4869 generic.go:334] "Generic (PLEG): container finished" podID="64b10dee-f478-4509-bb67-840ee41733e1" containerID="c43bf5b2c6fe6bcc6aa0fecf37d4c5c349b5c73087ab3ebc12bf93f709af9495" exitCode=0 Mar 12 15:32:15 crc kubenswrapper[4869]: I0312 15:32:15.629666 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz2q4" event={"ID":"64b10dee-f478-4509-bb67-840ee41733e1","Type":"ContainerDied","Data":"c43bf5b2c6fe6bcc6aa0fecf37d4c5c349b5c73087ab3ebc12bf93f709af9495"} Mar 12 15:32:16 crc kubenswrapper[4869]: I0312 15:32:16.649708 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz2q4" event={"ID":"64b10dee-f478-4509-bb67-840ee41733e1","Type":"ContainerStarted","Data":"7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1"} Mar 12 15:32:16 crc kubenswrapper[4869]: I0312 15:32:16.676741 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tz2q4" podStartSLOduration=2.2614910950000002 podStartE2EDuration="4.676723314s" podCreationTimestamp="2026-03-12 15:32:12 +0000 UTC" firstStartedPulling="2026-03-12 15:32:13.609814619 +0000 UTC m=+2685.895039917" lastFinishedPulling="2026-03-12 15:32:16.025046858 +0000 UTC m=+2688.310272136" observedRunningTime="2026-03-12 15:32:16.665661934 +0000 UTC m=+2688.950887222" watchObservedRunningTime="2026-03-12 15:32:16.676723314 +0000 UTC m=+2688.961948592" Mar 12 15:32:22 crc kubenswrapper[4869]: I0312 15:32:22.863851 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:22 crc kubenswrapper[4869]: I0312 15:32:22.864309 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:22 crc kubenswrapper[4869]: I0312 15:32:22.914683 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:23 crc kubenswrapper[4869]: I0312 15:32:23.770921 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:23 crc kubenswrapper[4869]: I0312 15:32:23.824835 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz2q4"] Mar 12 15:32:24 crc kubenswrapper[4869]: I0312 15:32:24.336794 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:32:24 crc kubenswrapper[4869]: E0312 15:32:24.337467 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:32:25 crc kubenswrapper[4869]: I0312 15:32:25.935330 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tz2q4" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="registry-server" containerID="cri-o://7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" gracePeriod=2 Mar 12 15:32:26 crc kubenswrapper[4869]: I0312 15:32:26.948706 4869 generic.go:334] "Generic (PLEG): container finished" podID="64b10dee-f478-4509-bb67-840ee41733e1" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" exitCode=0 Mar 12 15:32:26 crc kubenswrapper[4869]: I0312 15:32:26.948774 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz2q4" event={"ID":"64b10dee-f478-4509-bb67-840ee41733e1","Type":"ContainerDied","Data":"7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1"} Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.822555 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ln2fk"] Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.825075 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.837958 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ln2fk"] Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.856598 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-utilities\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.856709 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-catalog-content\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.856753 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxj8\" (UniqueName: \"kubernetes.io/projected/584580e2-1b2d-4def-ba44-63a563edebf8-kube-api-access-tlxj8\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " 
pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.959305 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-utilities\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.959477 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-catalog-content\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.959575 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxj8\" (UniqueName: \"kubernetes.io/projected/584580e2-1b2d-4def-ba44-63a563edebf8-kube-api-access-tlxj8\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.959881 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-utilities\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.959882 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-catalog-content\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " 
pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:27 crc kubenswrapper[4869]: I0312 15:32:27.980432 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxj8\" (UniqueName: \"kubernetes.io/projected/584580e2-1b2d-4def-ba44-63a563edebf8-kube-api-access-tlxj8\") pod \"certified-operators-ln2fk\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:28 crc kubenswrapper[4869]: I0312 15:32:28.170487 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:32 crc kubenswrapper[4869]: E0312 15:32:32.865348 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" cmd=["grpc_health_probe","-addr=:50051"] Mar 12 15:32:32 crc kubenswrapper[4869]: E0312 15:32:32.866263 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" cmd=["grpc_health_probe","-addr=:50051"] Mar 12 15:32:32 crc kubenswrapper[4869]: E0312 15:32:32.866501 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" cmd=["grpc_health_probe","-addr=:50051"] 
Mar 12 15:32:32 crc kubenswrapper[4869]: E0312 15:32:32.866552 4869 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-tz2q4" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="registry-server" Mar 12 15:32:34 crc kubenswrapper[4869]: I0312 15:32:34.768746 4869 scope.go:117] "RemoveContainer" containerID="9b91e001d0acac955428a1550e634a13efe70169b97ebf32d51852f46c0067a6" Mar 12 15:32:39 crc kubenswrapper[4869]: I0312 15:32:39.336702 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:32:39 crc kubenswrapper[4869]: E0312 15:32:39.337496 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:32:42 crc kubenswrapper[4869]: E0312 15:32:42.863626 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" cmd=["grpc_health_probe","-addr=:50051"] Mar 12 15:32:42 crc kubenswrapper[4869]: E0312 15:32:42.865618 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" cmd=["grpc_health_probe","-addr=:50051"] Mar 12 15:32:42 crc kubenswrapper[4869]: E0312 15:32:42.866104 4869 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" cmd=["grpc_health_probe","-addr=:50051"] Mar 12 15:32:42 crc kubenswrapper[4869]: E0312 15:32:42.866149 4869 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-tz2q4" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="registry-server" Mar 12 15:32:44 crc kubenswrapper[4869]: E0312 15:32:44.173436 4869 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 12 15:32:44 crc kubenswrapper[4869]: E0312 15:32:44.173893 4869 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nvrjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e6182d72-d424-4a24-bb32-1c43aaa82bba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 15:32:44 crc kubenswrapper[4869]: E0312 15:32:44.175404 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e6182d72-d424-4a24-bb32-1c43aaa82bba" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.456631 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.571550 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpwlk\" (UniqueName: \"kubernetes.io/projected/64b10dee-f478-4509-bb67-840ee41733e1-kube-api-access-mpwlk\") pod \"64b10dee-f478-4509-bb67-840ee41733e1\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.571697 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-utilities\") pod \"64b10dee-f478-4509-bb67-840ee41733e1\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.571796 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-catalog-content\") pod \"64b10dee-f478-4509-bb67-840ee41733e1\" (UID: \"64b10dee-f478-4509-bb67-840ee41733e1\") " Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.572587 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-utilities" (OuterVolumeSpecName: "utilities") pod "64b10dee-f478-4509-bb67-840ee41733e1" (UID: "64b10dee-f478-4509-bb67-840ee41733e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.583760 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b10dee-f478-4509-bb67-840ee41733e1-kube-api-access-mpwlk" (OuterVolumeSpecName: "kube-api-access-mpwlk") pod "64b10dee-f478-4509-bb67-840ee41733e1" (UID: "64b10dee-f478-4509-bb67-840ee41733e1"). InnerVolumeSpecName "kube-api-access-mpwlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.594277 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64b10dee-f478-4509-bb67-840ee41733e1" (UID: "64b10dee-f478-4509-bb67-840ee41733e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.674856 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.674890 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpwlk\" (UniqueName: \"kubernetes.io/projected/64b10dee-f478-4509-bb67-840ee41733e1-kube-api-access-mpwlk\") on node \"crc\" DevicePath \"\"" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.674902 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64b10dee-f478-4509-bb67-840ee41733e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:32:44 crc kubenswrapper[4869]: I0312 15:32:44.694529 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ln2fk"] Mar 12 15:32:44 crc kubenswrapper[4869]: W0312 15:32:44.696381 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584580e2_1b2d_4def_ba44_63a563edebf8.slice/crio-110baf70ece5e7c9278f1d26a1b95cb16e6ba3201b29bc830d1c53e27432fa14 WatchSource:0}: Error finding container 110baf70ece5e7c9278f1d26a1b95cb16e6ba3201b29bc830d1c53e27432fa14: Status 404 returned error can't find the container with id 
110baf70ece5e7c9278f1d26a1b95cb16e6ba3201b29bc830d1c53e27432fa14 Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.136886 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz2q4" event={"ID":"64b10dee-f478-4509-bb67-840ee41733e1","Type":"ContainerDied","Data":"12da2e3d871b3360337e06c21f45a09e3243cbcf32e8ccc2e45267253479b87f"} Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.137148 4869 scope.go:117] "RemoveContainer" containerID="7fccda92c74adeadc6ebde06a03168bc1ab0fdd5be8e9cddc4dc1627507e64b1" Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.137507 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz2q4" Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.146924 4869 generic.go:334] "Generic (PLEG): container finished" podID="584580e2-1b2d-4def-ba44-63a563edebf8" containerID="826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be" exitCode=0 Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.146987 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln2fk" event={"ID":"584580e2-1b2d-4def-ba44-63a563edebf8","Type":"ContainerDied","Data":"826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be"} Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.147075 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln2fk" event={"ID":"584580e2-1b2d-4def-ba44-63a563edebf8","Type":"ContainerStarted","Data":"110baf70ece5e7c9278f1d26a1b95cb16e6ba3201b29bc830d1c53e27432fa14"} Mar 12 15:32:45 crc kubenswrapper[4869]: E0312 15:32:45.151458 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" 
pod="openstack/tempest-tests-tempest" podUID="e6182d72-d424-4a24-bb32-1c43aaa82bba" Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.177835 4869 scope.go:117] "RemoveContainer" containerID="c43bf5b2c6fe6bcc6aa0fecf37d4c5c349b5c73087ab3ebc12bf93f709af9495" Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.216322 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz2q4"] Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.221769 4869 scope.go:117] "RemoveContainer" containerID="648d3a4930c214ca37b985092c844093c75c21bd6b0df8a6d1a723b721cc8163" Mar 12 15:32:45 crc kubenswrapper[4869]: I0312 15:32:45.225850 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz2q4"] Mar 12 15:32:46 crc kubenswrapper[4869]: I0312 15:32:46.159327 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln2fk" event={"ID":"584580e2-1b2d-4def-ba44-63a563edebf8","Type":"ContainerStarted","Data":"c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163"} Mar 12 15:32:46 crc kubenswrapper[4869]: I0312 15:32:46.346926 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b10dee-f478-4509-bb67-840ee41733e1" path="/var/lib/kubelet/pods/64b10dee-f478-4509-bb67-840ee41733e1/volumes" Mar 12 15:32:48 crc kubenswrapper[4869]: I0312 15:32:48.182877 4869 generic.go:334] "Generic (PLEG): container finished" podID="584580e2-1b2d-4def-ba44-63a563edebf8" containerID="c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163" exitCode=0 Mar 12 15:32:48 crc kubenswrapper[4869]: I0312 15:32:48.182955 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln2fk" event={"ID":"584580e2-1b2d-4def-ba44-63a563edebf8","Type":"ContainerDied","Data":"c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163"} Mar 12 15:32:49 crc kubenswrapper[4869]: I0312 15:32:49.206032 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln2fk" event={"ID":"584580e2-1b2d-4def-ba44-63a563edebf8","Type":"ContainerStarted","Data":"385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c"} Mar 12 15:32:49 crc kubenswrapper[4869]: I0312 15:32:49.237477 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ln2fk" podStartSLOduration=18.802790557 podStartE2EDuration="22.237454379s" podCreationTimestamp="2026-03-12 15:32:27 +0000 UTC" firstStartedPulling="2026-03-12 15:32:45.148358183 +0000 UTC m=+2717.433583461" lastFinishedPulling="2026-03-12 15:32:48.583022005 +0000 UTC m=+2720.868247283" observedRunningTime="2026-03-12 15:32:49.224795443 +0000 UTC m=+2721.510020761" watchObservedRunningTime="2026-03-12 15:32:49.237454379 +0000 UTC m=+2721.522679657" Mar 12 15:32:50 crc kubenswrapper[4869]: I0312 15:32:50.337272 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0" Mar 12 15:32:51 crc kubenswrapper[4869]: I0312 15:32:51.236714 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"fc439565269f3d0ef385b4279fbecd35febba51413b9ff6d43d5a2cd879b2a45"} Mar 12 15:32:56 crc kubenswrapper[4869]: I0312 15:32:56.928054 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 12 15:32:58 crc kubenswrapper[4869]: I0312 15:32:58.170955 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:58 crc kubenswrapper[4869]: I0312 15:32:58.171308 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:58 crc 
kubenswrapper[4869]: I0312 15:32:58.231417 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:32:58 crc kubenswrapper[4869]: I0312 15:32:58.305667 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e6182d72-d424-4a24-bb32-1c43aaa82bba","Type":"ContainerStarted","Data":"54c239e479b7d22722ff1817b02fdab11ff5547cc1e1e1e16c01f8e58a0b213c"} Mar 12 15:32:58 crc kubenswrapper[4869]: I0312 15:32:58.330189 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.9939388239999998 podStartE2EDuration="46.330168516s" podCreationTimestamp="2026-03-12 15:32:12 +0000 UTC" firstStartedPulling="2026-03-12 15:32:14.58937248 +0000 UTC m=+2686.874597758" lastFinishedPulling="2026-03-12 15:32:56.925602172 +0000 UTC m=+2729.210827450" observedRunningTime="2026-03-12 15:32:58.321471321 +0000 UTC m=+2730.606696599" watchObservedRunningTime="2026-03-12 15:32:58.330168516 +0000 UTC m=+2730.615393794" Mar 12 15:32:58 crc kubenswrapper[4869]: I0312 15:32:58.353676 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.227701 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ln2fk"] Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.229142 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ln2fk" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="registry-server" containerID="cri-o://385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c" gracePeriod=2 Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.754811 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.824935 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-utilities\") pod \"584580e2-1b2d-4def-ba44-63a563edebf8\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.825068 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-catalog-content\") pod \"584580e2-1b2d-4def-ba44-63a563edebf8\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.825168 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlxj8\" (UniqueName: \"kubernetes.io/projected/584580e2-1b2d-4def-ba44-63a563edebf8-kube-api-access-tlxj8\") pod \"584580e2-1b2d-4def-ba44-63a563edebf8\" (UID: \"584580e2-1b2d-4def-ba44-63a563edebf8\") " Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.826006 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-utilities" (OuterVolumeSpecName: "utilities") pod "584580e2-1b2d-4def-ba44-63a563edebf8" (UID: "584580e2-1b2d-4def-ba44-63a563edebf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.832889 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584580e2-1b2d-4def-ba44-63a563edebf8-kube-api-access-tlxj8" (OuterVolumeSpecName: "kube-api-access-tlxj8") pod "584580e2-1b2d-4def-ba44-63a563edebf8" (UID: "584580e2-1b2d-4def-ba44-63a563edebf8"). InnerVolumeSpecName "kube-api-access-tlxj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.877786 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "584580e2-1b2d-4def-ba44-63a563edebf8" (UID: "584580e2-1b2d-4def-ba44-63a563edebf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.927932 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.928198 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlxj8\" (UniqueName: \"kubernetes.io/projected/584580e2-1b2d-4def-ba44-63a563edebf8-kube-api-access-tlxj8\") on node \"crc\" DevicePath \"\"" Mar 12 15:33:01 crc kubenswrapper[4869]: I0312 15:33:01.928266 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584580e2-1b2d-4def-ba44-63a563edebf8-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.346803 4869 generic.go:334] "Generic (PLEG): container finished" podID="584580e2-1b2d-4def-ba44-63a563edebf8" containerID="385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c" exitCode=0 Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.346917 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ln2fk" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.358509 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln2fk" event={"ID":"584580e2-1b2d-4def-ba44-63a563edebf8","Type":"ContainerDied","Data":"385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c"} Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.358579 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln2fk" event={"ID":"584580e2-1b2d-4def-ba44-63a563edebf8","Type":"ContainerDied","Data":"110baf70ece5e7c9278f1d26a1b95cb16e6ba3201b29bc830d1c53e27432fa14"} Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.358604 4869 scope.go:117] "RemoveContainer" containerID="385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.385758 4869 scope.go:117] "RemoveContainer" containerID="c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.393274 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ln2fk"] Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.402054 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ln2fk"] Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.411111 4869 scope.go:117] "RemoveContainer" containerID="826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.451512 4869 scope.go:117] "RemoveContainer" containerID="385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c" Mar 12 15:33:02 crc kubenswrapper[4869]: E0312 15:33:02.454103 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c\": container with ID starting with 385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c not found: ID does not exist" containerID="385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.454141 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c"} err="failed to get container status \"385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c\": rpc error: code = NotFound desc = could not find container \"385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c\": container with ID starting with 385a672ed65f98f56581309e17a01c9bcf762aa5b2666d8a7d796da7262fab2c not found: ID does not exist" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.454168 4869 scope.go:117] "RemoveContainer" containerID="c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163" Mar 12 15:33:02 crc kubenswrapper[4869]: E0312 15:33:02.457935 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163\": container with ID starting with c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163 not found: ID does not exist" containerID="c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.458028 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163"} err="failed to get container status \"c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163\": rpc error: code = NotFound desc = could not find container \"c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163\": container with ID 
starting with c5ebca07ac47f21fbd1b920f54730521fdd653b673479f62820bfa9e87125163 not found: ID does not exist" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.458094 4869 scope.go:117] "RemoveContainer" containerID="826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be" Mar 12 15:33:02 crc kubenswrapper[4869]: E0312 15:33:02.458406 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be\": container with ID starting with 826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be not found: ID does not exist" containerID="826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be" Mar 12 15:33:02 crc kubenswrapper[4869]: I0312 15:33:02.458479 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be"} err="failed to get container status \"826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be\": rpc error: code = NotFound desc = could not find container \"826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be\": container with ID starting with 826c8a9c3581d0a85c3a3618f0c6344deed7f26e0e70b0c943e0d488904b87be not found: ID does not exist" Mar 12 15:33:04 crc kubenswrapper[4869]: I0312 15:33:04.358452 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" path="/var/lib/kubelet/pods/584580e2-1b2d-4def-ba44-63a563edebf8/volumes" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.155564 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555494-v95fd"] Mar 12 15:34:00 crc kubenswrapper[4869]: E0312 15:34:00.156439 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="registry-server" Mar 12 15:34:00 crc 
kubenswrapper[4869]: I0312 15:34:00.156451 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="registry-server" Mar 12 15:34:00 crc kubenswrapper[4869]: E0312 15:34:00.156465 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="registry-server" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.156471 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="registry-server" Mar 12 15:34:00 crc kubenswrapper[4869]: E0312 15:34:00.156486 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="extract-utilities" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.156493 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="extract-utilities" Mar 12 15:34:00 crc kubenswrapper[4869]: E0312 15:34:00.156503 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="extract-content" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.156509 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="extract-content" Mar 12 15:34:00 crc kubenswrapper[4869]: E0312 15:34:00.156518 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="extract-content" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.156524 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="extract-content" Mar 12 15:34:00 crc kubenswrapper[4869]: E0312 15:34:00.156558 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="extract-utilities" Mar 12 15:34:00 crc 
kubenswrapper[4869]: I0312 15:34:00.156564 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="extract-utilities" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.156730 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="584580e2-1b2d-4def-ba44-63a563edebf8" containerName="registry-server" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.156758 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b10dee-f478-4509-bb67-840ee41733e1" containerName="registry-server" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.157340 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-v95fd" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.159874 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.160597 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.160759 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.166280 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-v95fd"] Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.244867 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w527x\" (UniqueName: \"kubernetes.io/projected/fd32572d-ce1d-4c27-b381-351c1866a2f4-kube-api-access-w527x\") pod \"auto-csr-approver-29555494-v95fd\" (UID: \"fd32572d-ce1d-4c27-b381-351c1866a2f4\") " pod="openshift-infra/auto-csr-approver-29555494-v95fd" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.346352 4869 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w527x\" (UniqueName: \"kubernetes.io/projected/fd32572d-ce1d-4c27-b381-351c1866a2f4-kube-api-access-w527x\") pod \"auto-csr-approver-29555494-v95fd\" (UID: \"fd32572d-ce1d-4c27-b381-351c1866a2f4\") " pod="openshift-infra/auto-csr-approver-29555494-v95fd" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.375405 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w527x\" (UniqueName: \"kubernetes.io/projected/fd32572d-ce1d-4c27-b381-351c1866a2f4-kube-api-access-w527x\") pod \"auto-csr-approver-29555494-v95fd\" (UID: \"fd32572d-ce1d-4c27-b381-351c1866a2f4\") " pod="openshift-infra/auto-csr-approver-29555494-v95fd" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.481530 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-v95fd" Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.967651 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-v95fd"] Mar 12 15:34:00 crc kubenswrapper[4869]: I0312 15:34:00.987241 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:34:01 crc kubenswrapper[4869]: I0312 15:34:01.867907 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555494-v95fd" event={"ID":"fd32572d-ce1d-4c27-b381-351c1866a2f4","Type":"ContainerStarted","Data":"6477462c2ebf6b14056e48e0de2d7b72dd619cc07b4e98c6b37c04746c6ebed4"} Mar 12 15:34:02 crc kubenswrapper[4869]: I0312 15:34:02.879188 4869 generic.go:334] "Generic (PLEG): container finished" podID="fd32572d-ce1d-4c27-b381-351c1866a2f4" containerID="fe0bb8275b71d5db0e6927500c30a4ef55af23c7dc5194db3b67b9f6d0256d87" exitCode=0 Mar 12 15:34:02 crc kubenswrapper[4869]: I0312 15:34:02.879444 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29555494-v95fd" event={"ID":"fd32572d-ce1d-4c27-b381-351c1866a2f4","Type":"ContainerDied","Data":"fe0bb8275b71d5db0e6927500c30a4ef55af23c7dc5194db3b67b9f6d0256d87"} Mar 12 15:34:04 crc kubenswrapper[4869]: I0312 15:34:04.441371 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-v95fd" Mar 12 15:34:04 crc kubenswrapper[4869]: I0312 15:34:04.455796 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w527x\" (UniqueName: \"kubernetes.io/projected/fd32572d-ce1d-4c27-b381-351c1866a2f4-kube-api-access-w527x\") pod \"fd32572d-ce1d-4c27-b381-351c1866a2f4\" (UID: \"fd32572d-ce1d-4c27-b381-351c1866a2f4\") " Mar 12 15:34:04 crc kubenswrapper[4869]: I0312 15:34:04.473613 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd32572d-ce1d-4c27-b381-351c1866a2f4-kube-api-access-w527x" (OuterVolumeSpecName: "kube-api-access-w527x") pod "fd32572d-ce1d-4c27-b381-351c1866a2f4" (UID: "fd32572d-ce1d-4c27-b381-351c1866a2f4"). InnerVolumeSpecName "kube-api-access-w527x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:34:04 crc kubenswrapper[4869]: I0312 15:34:04.562550 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w527x\" (UniqueName: \"kubernetes.io/projected/fd32572d-ce1d-4c27-b381-351c1866a2f4-kube-api-access-w527x\") on node \"crc\" DevicePath \"\"" Mar 12 15:34:04 crc kubenswrapper[4869]: I0312 15:34:04.899500 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555494-v95fd" event={"ID":"fd32572d-ce1d-4c27-b381-351c1866a2f4","Type":"ContainerDied","Data":"6477462c2ebf6b14056e48e0de2d7b72dd619cc07b4e98c6b37c04746c6ebed4"} Mar 12 15:34:04 crc kubenswrapper[4869]: I0312 15:34:04.899760 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6477462c2ebf6b14056e48e0de2d7b72dd619cc07b4e98c6b37c04746c6ebed4" Mar 12 15:34:04 crc kubenswrapper[4869]: I0312 15:34:04.899570 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-v95fd" Mar 12 15:34:05 crc kubenswrapper[4869]: I0312 15:34:05.517147 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-x9dqp"] Mar 12 15:34:05 crc kubenswrapper[4869]: I0312 15:34:05.535863 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-x9dqp"] Mar 12 15:34:06 crc kubenswrapper[4869]: I0312 15:34:06.348991 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f751fb96-803e-4ae0-9abd-62c03ea50ecd" path="/var/lib/kubelet/pods/f751fb96-803e-4ae0-9abd-62c03ea50ecd/volumes" Mar 12 15:34:44 crc kubenswrapper[4869]: I0312 15:34:44.309202 4869 scope.go:117] "RemoveContainer" containerID="c12996e2251ffadbd88c2839ea7a1003235f7983b65dc8fbe95412b73961cc00" Mar 12 15:35:19 crc kubenswrapper[4869]: I0312 15:35:19.683907 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:35:19 crc kubenswrapper[4869]: I0312 15:35:19.684554 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.093460 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lj98d"] Mar 12 15:35:32 crc kubenswrapper[4869]: E0312 15:35:32.095728 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd32572d-ce1d-4c27-b381-351c1866a2f4" containerName="oc" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.095844 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd32572d-ce1d-4c27-b381-351c1866a2f4" containerName="oc" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.096148 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd32572d-ce1d-4c27-b381-351c1866a2f4" containerName="oc" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.098044 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.107462 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lj98d"] Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.173585 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpml8\" (UniqueName: \"kubernetes.io/projected/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-kube-api-access-rpml8\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.173844 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-catalog-content\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.174069 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-utilities\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.276062 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-utilities\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.276187 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rpml8\" (UniqueName: \"kubernetes.io/projected/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-kube-api-access-rpml8\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.276282 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-catalog-content\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.276900 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-catalog-content\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.277163 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-utilities\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.304376 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpml8\" (UniqueName: \"kubernetes.io/projected/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-kube-api-access-rpml8\") pod \"community-operators-lj98d\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.420309 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:32 crc kubenswrapper[4869]: I0312 15:35:32.877948 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lj98d"] Mar 12 15:35:32 crc kubenswrapper[4869]: W0312 15:35:32.882704 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d71cdcd_1cd7_4b31_b5d2_eb4bea695ab8.slice/crio-3604acffd987eb7df8ca8c1f975f81759969828c2da64a3f5c1e35e8d0db7b53 WatchSource:0}: Error finding container 3604acffd987eb7df8ca8c1f975f81759969828c2da64a3f5c1e35e8d0db7b53: Status 404 returned error can't find the container with id 3604acffd987eb7df8ca8c1f975f81759969828c2da64a3f5c1e35e8d0db7b53 Mar 12 15:35:33 crc kubenswrapper[4869]: I0312 15:35:33.737108 4869 generic.go:334] "Generic (PLEG): container finished" podID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerID="a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a" exitCode=0 Mar 12 15:35:33 crc kubenswrapper[4869]: I0312 15:35:33.737158 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj98d" event={"ID":"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8","Type":"ContainerDied","Data":"a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a"} Mar 12 15:35:33 crc kubenswrapper[4869]: I0312 15:35:33.737407 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj98d" event={"ID":"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8","Type":"ContainerStarted","Data":"3604acffd987eb7df8ca8c1f975f81759969828c2da64a3f5c1e35e8d0db7b53"} Mar 12 15:35:34 crc kubenswrapper[4869]: I0312 15:35:34.749040 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj98d" 
event={"ID":"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8","Type":"ContainerStarted","Data":"6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a"} Mar 12 15:35:36 crc kubenswrapper[4869]: I0312 15:35:36.775633 4869 generic.go:334] "Generic (PLEG): container finished" podID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerID="6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a" exitCode=0 Mar 12 15:35:36 crc kubenswrapper[4869]: I0312 15:35:36.775695 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj98d" event={"ID":"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8","Type":"ContainerDied","Data":"6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a"} Mar 12 15:35:37 crc kubenswrapper[4869]: I0312 15:35:37.786472 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj98d" event={"ID":"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8","Type":"ContainerStarted","Data":"d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38"} Mar 12 15:35:37 crc kubenswrapper[4869]: I0312 15:35:37.814636 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lj98d" podStartSLOduration=2.328956505 podStartE2EDuration="5.814619358s" podCreationTimestamp="2026-03-12 15:35:32 +0000 UTC" firstStartedPulling="2026-03-12 15:35:33.738742714 +0000 UTC m=+2886.023967992" lastFinishedPulling="2026-03-12 15:35:37.224405567 +0000 UTC m=+2889.509630845" observedRunningTime="2026-03-12 15:35:37.801370016 +0000 UTC m=+2890.086595294" watchObservedRunningTime="2026-03-12 15:35:37.814619358 +0000 UTC m=+2890.099844636" Mar 12 15:35:42 crc kubenswrapper[4869]: I0312 15:35:42.420890 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:42 crc kubenswrapper[4869]: I0312 15:35:42.421575 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:43 crc kubenswrapper[4869]: I0312 15:35:43.489305 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lj98d" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="registry-server" probeResult="failure" output=< Mar 12 15:35:43 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 15:35:43 crc kubenswrapper[4869]: > Mar 12 15:35:49 crc kubenswrapper[4869]: I0312 15:35:49.684388 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:35:49 crc kubenswrapper[4869]: I0312 15:35:49.685298 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:35:52 crc kubenswrapper[4869]: I0312 15:35:52.472535 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:52 crc kubenswrapper[4869]: I0312 15:35:52.526775 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:52 crc kubenswrapper[4869]: I0312 15:35:52.704729 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lj98d"] Mar 12 15:35:53 crc kubenswrapper[4869]: I0312 15:35:53.929157 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lj98d" 
podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="registry-server" containerID="cri-o://d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38" gracePeriod=2 Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.655881 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.725828 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-catalog-content\") pod \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.726021 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpml8\" (UniqueName: \"kubernetes.io/projected/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-kube-api-access-rpml8\") pod \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.726084 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-utilities\") pod \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\" (UID: \"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8\") " Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.726843 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-utilities" (OuterVolumeSpecName: "utilities") pod "2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" (UID: "2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.737700 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-kube-api-access-rpml8" (OuterVolumeSpecName: "kube-api-access-rpml8") pod "2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" (UID: "2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8"). InnerVolumeSpecName "kube-api-access-rpml8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.795313 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" (UID: "2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.828872 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpml8\" (UniqueName: \"kubernetes.io/projected/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-kube-api-access-rpml8\") on node \"crc\" DevicePath \"\"" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.829146 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.829220 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.940402 4869 generic.go:334] "Generic (PLEG): container finished" podID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" 
containerID="d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38" exitCode=0 Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.940454 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj98d" event={"ID":"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8","Type":"ContainerDied","Data":"d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38"} Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.940523 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj98d" event={"ID":"2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8","Type":"ContainerDied","Data":"3604acffd987eb7df8ca8c1f975f81759969828c2da64a3f5c1e35e8d0db7b53"} Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.940480 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lj98d" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.940560 4869 scope.go:117] "RemoveContainer" containerID="d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.959077 4869 scope.go:117] "RemoveContainer" containerID="6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a" Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.972146 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lj98d"] Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.980099 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lj98d"] Mar 12 15:35:54 crc kubenswrapper[4869]: I0312 15:35:54.998750 4869 scope.go:117] "RemoveContainer" containerID="a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a" Mar 12 15:35:55 crc kubenswrapper[4869]: I0312 15:35:55.040114 4869 scope.go:117] "RemoveContainer" containerID="d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38" Mar 12 
15:35:55 crc kubenswrapper[4869]: E0312 15:35:55.040754 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38\": container with ID starting with d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38 not found: ID does not exist" containerID="d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38" Mar 12 15:35:55 crc kubenswrapper[4869]: I0312 15:35:55.040809 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38"} err="failed to get container status \"d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38\": rpc error: code = NotFound desc = could not find container \"d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38\": container with ID starting with d8a69f54fc64a52fdb53c4e8a5320b869f40f669600977b15b2f9b785fa4ed38 not found: ID does not exist" Mar 12 15:35:55 crc kubenswrapper[4869]: I0312 15:35:55.040844 4869 scope.go:117] "RemoveContainer" containerID="6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a" Mar 12 15:35:55 crc kubenswrapper[4869]: E0312 15:35:55.044959 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a\": container with ID starting with 6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a not found: ID does not exist" containerID="6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a" Mar 12 15:35:55 crc kubenswrapper[4869]: I0312 15:35:55.045053 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a"} err="failed to get container status 
\"6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a\": rpc error: code = NotFound desc = could not find container \"6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a\": container with ID starting with 6410f4ebd03d6bb2b59ebd53131836545820b4e8c29ce4fbb96e0bfbb258610a not found: ID does not exist" Mar 12 15:35:55 crc kubenswrapper[4869]: I0312 15:35:55.045527 4869 scope.go:117] "RemoveContainer" containerID="a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a" Mar 12 15:35:55 crc kubenswrapper[4869]: E0312 15:35:55.046657 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a\": container with ID starting with a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a not found: ID does not exist" containerID="a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a" Mar 12 15:35:55 crc kubenswrapper[4869]: I0312 15:35:55.046681 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a"} err="failed to get container status \"a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a\": rpc error: code = NotFound desc = could not find container \"a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a\": container with ID starting with a32eae167f26ac198c9227729c39783b4627ffe3b38d83a9b023eeedd988869a not found: ID does not exist" Mar 12 15:35:56 crc kubenswrapper[4869]: I0312 15:35:56.348917 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" path="/var/lib/kubelet/pods/2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8/volumes" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.140519 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555496-fjblp"] Mar 12 15:36:00 
crc kubenswrapper[4869]: E0312 15:36:00.141473 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="registry-server" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.141487 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="registry-server" Mar 12 15:36:00 crc kubenswrapper[4869]: E0312 15:36:00.141506 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="extract-utilities" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.141512 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="extract-utilities" Mar 12 15:36:00 crc kubenswrapper[4869]: E0312 15:36:00.141522 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="extract-content" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.141529 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="extract-content" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.141729 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d71cdcd-1cd7-4b31-b5d2-eb4bea695ab8" containerName="registry-server" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.142311 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-fjblp" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.144826 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.147036 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.147362 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.160270 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-fjblp"] Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.235010 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d76jj\" (UniqueName: \"kubernetes.io/projected/d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d-kube-api-access-d76jj\") pod \"auto-csr-approver-29555496-fjblp\" (UID: \"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d\") " pod="openshift-infra/auto-csr-approver-29555496-fjblp" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.337354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d76jj\" (UniqueName: \"kubernetes.io/projected/d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d-kube-api-access-d76jj\") pod \"auto-csr-approver-29555496-fjblp\" (UID: \"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d\") " pod="openshift-infra/auto-csr-approver-29555496-fjblp" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.372759 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d76jj\" (UniqueName: \"kubernetes.io/projected/d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d-kube-api-access-d76jj\") pod \"auto-csr-approver-29555496-fjblp\" (UID: \"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d\") " 
pod="openshift-infra/auto-csr-approver-29555496-fjblp" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.459809 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-fjblp" Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.901717 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-fjblp"] Mar 12 15:36:00 crc kubenswrapper[4869]: I0312 15:36:00.994270 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555496-fjblp" event={"ID":"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d","Type":"ContainerStarted","Data":"47b7af043eed71c02e0d758aebacc714ad4b6dac42d498e9a62efb08abd0d9f9"} Mar 12 15:36:03 crc kubenswrapper[4869]: I0312 15:36:03.015505 4869 generic.go:334] "Generic (PLEG): container finished" podID="d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d" containerID="6525aa150dac46af6468f9077ba6402d1b7dae6222425afb8f2f789e91e312ac" exitCode=0 Mar 12 15:36:03 crc kubenswrapper[4869]: I0312 15:36:03.015614 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555496-fjblp" event={"ID":"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d","Type":"ContainerDied","Data":"6525aa150dac46af6468f9077ba6402d1b7dae6222425afb8f2f789e91e312ac"} Mar 12 15:36:04 crc kubenswrapper[4869]: I0312 15:36:04.679994 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-fjblp" Mar 12 15:36:04 crc kubenswrapper[4869]: I0312 15:36:04.832300 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d76jj\" (UniqueName: \"kubernetes.io/projected/d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d-kube-api-access-d76jj\") pod \"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d\" (UID: \"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d\") " Mar 12 15:36:04 crc kubenswrapper[4869]: I0312 15:36:04.845184 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d-kube-api-access-d76jj" (OuterVolumeSpecName: "kube-api-access-d76jj") pod "d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d" (UID: "d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d"). InnerVolumeSpecName "kube-api-access-d76jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:36:04 crc kubenswrapper[4869]: I0312 15:36:04.934673 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d76jj\" (UniqueName: \"kubernetes.io/projected/d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d-kube-api-access-d76jj\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:05 crc kubenswrapper[4869]: I0312 15:36:05.036846 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555496-fjblp" event={"ID":"d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d","Type":"ContainerDied","Data":"47b7af043eed71c02e0d758aebacc714ad4b6dac42d498e9a62efb08abd0d9f9"} Mar 12 15:36:05 crc kubenswrapper[4869]: I0312 15:36:05.036893 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47b7af043eed71c02e0d758aebacc714ad4b6dac42d498e9a62efb08abd0d9f9" Mar 12 15:36:05 crc kubenswrapper[4869]: I0312 15:36:05.036948 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-fjblp" Mar 12 15:36:05 crc kubenswrapper[4869]: I0312 15:36:05.748702 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-t6mmg"] Mar 12 15:36:05 crc kubenswrapper[4869]: I0312 15:36:05.762561 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-t6mmg"] Mar 12 15:36:06 crc kubenswrapper[4869]: I0312 15:36:06.345631 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f29abb-8a8d-4684-86f8-f89eff242676" path="/var/lib/kubelet/pods/50f29abb-8a8d-4684-86f8-f89eff242676/volumes" Mar 12 15:36:19 crc kubenswrapper[4869]: I0312 15:36:19.684664 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:36:19 crc kubenswrapper[4869]: I0312 15:36:19.685217 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:36:19 crc kubenswrapper[4869]: I0312 15:36:19.685278 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:36:19 crc kubenswrapper[4869]: I0312 15:36:19.686377 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc439565269f3d0ef385b4279fbecd35febba51413b9ff6d43d5a2cd879b2a45"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 15:36:19 crc kubenswrapper[4869]: I0312 15:36:19.686458 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://fc439565269f3d0ef385b4279fbecd35febba51413b9ff6d43d5a2cd879b2a45" gracePeriod=600
Mar 12 15:36:20 crc kubenswrapper[4869]: I0312 15:36:20.168670 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="fc439565269f3d0ef385b4279fbecd35febba51413b9ff6d43d5a2cd879b2a45" exitCode=0
Mar 12 15:36:20 crc kubenswrapper[4869]: I0312 15:36:20.169150 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"fc439565269f3d0ef385b4279fbecd35febba51413b9ff6d43d5a2cd879b2a45"}
Mar 12 15:36:20 crc kubenswrapper[4869]: I0312 15:36:20.169476 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"}
Mar 12 15:36:20 crc kubenswrapper[4869]: I0312 15:36:20.169520 4869 scope.go:117] "RemoveContainer" containerID="c9a47bbb284d539c798de0e32d0c54787cc43e1f77e98632a15c2b240ce0d5b0"
Mar 12 15:36:44 crc kubenswrapper[4869]: I0312 15:36:44.460687 4869 scope.go:117] "RemoveContainer" containerID="7f66a56bd537fafd7aeac991f2cbe5803d91dcf9fa0874d0d894b1c0849796c6"
Mar 12 15:37:49 crc kubenswrapper[4869]: I0312 15:37:49.278642 4869 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9xp9n container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 12 15:37:49 crc kubenswrapper[4869]: I0312 15:37:49.279146 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" podUID="2d3a496a-d6d4-474b-8ab8-fffa4e661e07" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 15:37:49 crc kubenswrapper[4869]: I0312 15:37:49.484001 4869 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9xp9n container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": context deadline exceeded" start-of-body=
Mar 12 15:37:49 crc kubenswrapper[4869]: I0312 15:37:49.484055 4869 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9xp9n" podUID="2d3a496a-d6d4-474b-8ab8-fffa4e661e07" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": context deadline exceeded"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.144954 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555498-nwqlg"]
Mar 12 15:38:00 crc kubenswrapper[4869]: E0312 15:38:00.145917 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d" containerName="oc"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.145931 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d" containerName="oc"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.146135 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d" containerName="oc"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.147059 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-nwqlg"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.149989 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.150351 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.150505 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.156005 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-nwqlg"]
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.330692 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnd6\" (UniqueName: \"kubernetes.io/projected/52818aa3-3c68-4f32-bdce-b46d8fbc7315-kube-api-access-swnd6\") pod \"auto-csr-approver-29555498-nwqlg\" (UID: \"52818aa3-3c68-4f32-bdce-b46d8fbc7315\") " pod="openshift-infra/auto-csr-approver-29555498-nwqlg"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.433960 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnd6\" (UniqueName: \"kubernetes.io/projected/52818aa3-3c68-4f32-bdce-b46d8fbc7315-kube-api-access-swnd6\") pod \"auto-csr-approver-29555498-nwqlg\" (UID: \"52818aa3-3c68-4f32-bdce-b46d8fbc7315\") " pod="openshift-infra/auto-csr-approver-29555498-nwqlg"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.463565 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnd6\" (UniqueName: \"kubernetes.io/projected/52818aa3-3c68-4f32-bdce-b46d8fbc7315-kube-api-access-swnd6\") pod \"auto-csr-approver-29555498-nwqlg\" (UID: \"52818aa3-3c68-4f32-bdce-b46d8fbc7315\") " pod="openshift-infra/auto-csr-approver-29555498-nwqlg"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.494493 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-nwqlg"
Mar 12 15:38:00 crc kubenswrapper[4869]: I0312 15:38:00.979263 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-nwqlg"]
Mar 12 15:38:01 crc kubenswrapper[4869]: I0312 15:38:01.303846 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555498-nwqlg" event={"ID":"52818aa3-3c68-4f32-bdce-b46d8fbc7315","Type":"ContainerStarted","Data":"aedf88a5dd476b944d80ba0efe2d09ce0c8b82e511bce3ca7783fe0601a430c9"}
Mar 12 15:38:03 crc kubenswrapper[4869]: I0312 15:38:03.320694 4869 generic.go:334] "Generic (PLEG): container finished" podID="52818aa3-3c68-4f32-bdce-b46d8fbc7315" containerID="aa29d85398b129cdff3de361f66fe962f55883df5943e05b597a1f8bf6acbd0e" exitCode=0
Mar 12 15:38:03 crc kubenswrapper[4869]: I0312 15:38:03.320798 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555498-nwqlg" event={"ID":"52818aa3-3c68-4f32-bdce-b46d8fbc7315","Type":"ContainerDied","Data":"aa29d85398b129cdff3de361f66fe962f55883df5943e05b597a1f8bf6acbd0e"}
Mar 12 15:38:04 crc kubenswrapper[4869]: I0312 15:38:04.823626 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-nwqlg"
Mar 12 15:38:04 crc kubenswrapper[4869]: I0312 15:38:04.944418 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swnd6\" (UniqueName: \"kubernetes.io/projected/52818aa3-3c68-4f32-bdce-b46d8fbc7315-kube-api-access-swnd6\") pod \"52818aa3-3c68-4f32-bdce-b46d8fbc7315\" (UID: \"52818aa3-3c68-4f32-bdce-b46d8fbc7315\") "
Mar 12 15:38:04 crc kubenswrapper[4869]: I0312 15:38:04.951203 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52818aa3-3c68-4f32-bdce-b46d8fbc7315-kube-api-access-swnd6" (OuterVolumeSpecName: "kube-api-access-swnd6") pod "52818aa3-3c68-4f32-bdce-b46d8fbc7315" (UID: "52818aa3-3c68-4f32-bdce-b46d8fbc7315"). InnerVolumeSpecName "kube-api-access-swnd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:38:05 crc kubenswrapper[4869]: I0312 15:38:05.047542 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swnd6\" (UniqueName: \"kubernetes.io/projected/52818aa3-3c68-4f32-bdce-b46d8fbc7315-kube-api-access-swnd6\") on node \"crc\" DevicePath \"\""
Mar 12 15:38:05 crc kubenswrapper[4869]: I0312 15:38:05.338403 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555498-nwqlg" event={"ID":"52818aa3-3c68-4f32-bdce-b46d8fbc7315","Type":"ContainerDied","Data":"aedf88a5dd476b944d80ba0efe2d09ce0c8b82e511bce3ca7783fe0601a430c9"}
Mar 12 15:38:05 crc kubenswrapper[4869]: I0312 15:38:05.338448 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aedf88a5dd476b944d80ba0efe2d09ce0c8b82e511bce3ca7783fe0601a430c9"
Mar 12 15:38:05 crc kubenswrapper[4869]: I0312 15:38:05.338490 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-nwqlg"
Mar 12 15:38:05 crc kubenswrapper[4869]: I0312 15:38:05.900495 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-cpxgp"]
Mar 12 15:38:05 crc kubenswrapper[4869]: I0312 15:38:05.909523 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-cpxgp"]
Mar 12 15:38:06 crc kubenswrapper[4869]: I0312 15:38:06.346423 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="745ca9cb-c31f-4c01-aaff-a67e2f6cb59c" path="/var/lib/kubelet/pods/745ca9cb-c31f-4c01-aaff-a67e2f6cb59c/volumes"
Mar 12 15:38:44 crc kubenswrapper[4869]: I0312 15:38:44.575194 4869 scope.go:117] "RemoveContainer" containerID="91be9c4955bb86d47a1ab3bbdc0770ad1a62689768bf9ba14393d1b1abbeaf55"
Mar 12 15:38:49 crc kubenswrapper[4869]: I0312 15:38:49.684158 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:38:49 crc kubenswrapper[4869]: I0312 15:38:49.684839 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:39:19 crc kubenswrapper[4869]: I0312 15:39:19.683846 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:39:19 crc kubenswrapper[4869]: I0312 15:39:19.684397 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:39:49 crc kubenswrapper[4869]: I0312 15:39:49.684496 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:39:49 crc kubenswrapper[4869]: I0312 15:39:49.685225 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:39:49 crc kubenswrapper[4869]: I0312 15:39:49.685271 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz"
Mar 12 15:39:49 crc kubenswrapper[4869]: I0312 15:39:49.686305 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 15:39:49 crc kubenswrapper[4869]: I0312 15:39:49.686384 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" gracePeriod=600
Mar 12 15:39:49 crc kubenswrapper[4869]: E0312 15:39:49.808964 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:39:50 crc kubenswrapper[4869]: I0312 15:39:50.290054 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" exitCode=0
Mar 12 15:39:50 crc kubenswrapper[4869]: I0312 15:39:50.290126 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"}
Mar 12 15:39:50 crc kubenswrapper[4869]: I0312 15:39:50.290396 4869 scope.go:117] "RemoveContainer" containerID="fc439565269f3d0ef385b4279fbecd35febba51413b9ff6d43d5a2cd879b2a45"
Mar 12 15:39:50 crc kubenswrapper[4869]: I0312 15:39:50.291134 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"
Mar 12 15:39:50 crc kubenswrapper[4869]: E0312 15:39:50.291449 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.201971 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555500-84j67"]
Mar 12 15:40:00 crc kubenswrapper[4869]: E0312 15:40:00.203635 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52818aa3-3c68-4f32-bdce-b46d8fbc7315" containerName="oc"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.203715 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="52818aa3-3c68-4f32-bdce-b46d8fbc7315" containerName="oc"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.204336 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="52818aa3-3c68-4f32-bdce-b46d8fbc7315" containerName="oc"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.205271 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-84j67"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.207583 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.207654 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.207946 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.213740 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-84j67"]
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.312034 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cq47\" (UniqueName: \"kubernetes.io/projected/4b01772f-9975-4982-8fd3-8fef34017b68-kube-api-access-9cq47\") pod \"auto-csr-approver-29555500-84j67\" (UID: \"4b01772f-9975-4982-8fd3-8fef34017b68\") " pod="openshift-infra/auto-csr-approver-29555500-84j67"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.413995 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cq47\" (UniqueName: \"kubernetes.io/projected/4b01772f-9975-4982-8fd3-8fef34017b68-kube-api-access-9cq47\") pod \"auto-csr-approver-29555500-84j67\" (UID: \"4b01772f-9975-4982-8fd3-8fef34017b68\") " pod="openshift-infra/auto-csr-approver-29555500-84j67"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.448531 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cq47\" (UniqueName: \"kubernetes.io/projected/4b01772f-9975-4982-8fd3-8fef34017b68-kube-api-access-9cq47\") pod \"auto-csr-approver-29555500-84j67\" (UID: \"4b01772f-9975-4982-8fd3-8fef34017b68\") " pod="openshift-infra/auto-csr-approver-29555500-84j67"
Mar 12 15:40:00 crc kubenswrapper[4869]: I0312 15:40:00.529514 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-84j67"
Mar 12 15:40:01 crc kubenswrapper[4869]: I0312 15:40:01.000671 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-84j67"]
Mar 12 15:40:01 crc kubenswrapper[4869]: I0312 15:40:01.003305 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 15:40:01 crc kubenswrapper[4869]: I0312 15:40:01.388991 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555500-84j67" event={"ID":"4b01772f-9975-4982-8fd3-8fef34017b68","Type":"ContainerStarted","Data":"988e5f205651a0a6ef672197e72d39d1091938855798decfc576b969532d75dd"}
Mar 12 15:40:03 crc kubenswrapper[4869]: I0312 15:40:03.404808 4869 generic.go:334] "Generic (PLEG): container finished" podID="4b01772f-9975-4982-8fd3-8fef34017b68" containerID="e119966be2a6568d1202b338d024b32867398a518dce955a6a9f36a224f886b0" exitCode=0
Mar 12 15:40:03 crc kubenswrapper[4869]: I0312 15:40:03.404909 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555500-84j67" event={"ID":"4b01772f-9975-4982-8fd3-8fef34017b68","Type":"ContainerDied","Data":"e119966be2a6568d1202b338d024b32867398a518dce955a6a9f36a224f886b0"}
Mar 12 15:40:04 crc kubenswrapper[4869]: I0312 15:40:04.353846 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"
Mar 12 15:40:04 crc kubenswrapper[4869]: E0312 15:40:04.354896 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:40:05 crc kubenswrapper[4869]: I0312 15:40:05.046910 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-84j67"
Mar 12 15:40:05 crc kubenswrapper[4869]: I0312 15:40:05.214358 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cq47\" (UniqueName: \"kubernetes.io/projected/4b01772f-9975-4982-8fd3-8fef34017b68-kube-api-access-9cq47\") pod \"4b01772f-9975-4982-8fd3-8fef34017b68\" (UID: \"4b01772f-9975-4982-8fd3-8fef34017b68\") "
Mar 12 15:40:05 crc kubenswrapper[4869]: I0312 15:40:05.228859 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b01772f-9975-4982-8fd3-8fef34017b68-kube-api-access-9cq47" (OuterVolumeSpecName: "kube-api-access-9cq47") pod "4b01772f-9975-4982-8fd3-8fef34017b68" (UID: "4b01772f-9975-4982-8fd3-8fef34017b68"). InnerVolumeSpecName "kube-api-access-9cq47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:40:05 crc kubenswrapper[4869]: I0312 15:40:05.317224 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cq47\" (UniqueName: \"kubernetes.io/projected/4b01772f-9975-4982-8fd3-8fef34017b68-kube-api-access-9cq47\") on node \"crc\" DevicePath \"\""
Mar 12 15:40:05 crc kubenswrapper[4869]: I0312 15:40:05.424212 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555500-84j67" event={"ID":"4b01772f-9975-4982-8fd3-8fef34017b68","Type":"ContainerDied","Data":"988e5f205651a0a6ef672197e72d39d1091938855798decfc576b969532d75dd"}
Mar 12 15:40:05 crc kubenswrapper[4869]: I0312 15:40:05.424247 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="988e5f205651a0a6ef672197e72d39d1091938855798decfc576b969532d75dd"
Mar 12 15:40:05 crc kubenswrapper[4869]: I0312 15:40:05.424288 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-84j67"
Mar 12 15:40:06 crc kubenswrapper[4869]: I0312 15:40:06.133531 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-v95fd"]
Mar 12 15:40:06 crc kubenswrapper[4869]: I0312 15:40:06.142297 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-v95fd"]
Mar 12 15:40:06 crc kubenswrapper[4869]: I0312 15:40:06.346764 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd32572d-ce1d-4c27-b381-351c1866a2f4" path="/var/lib/kubelet/pods/fd32572d-ce1d-4c27-b381-351c1866a2f4/volumes"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.095427 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hvrb7"]
Mar 12 15:40:19 crc kubenswrapper[4869]: E0312 15:40:19.096525 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b01772f-9975-4982-8fd3-8fef34017b68" containerName="oc"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.096555 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b01772f-9975-4982-8fd3-8fef34017b68" containerName="oc"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.096724 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b01772f-9975-4982-8fd3-8fef34017b68" containerName="oc"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.098170 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.128137 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvrb7"]
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.202469 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-utilities\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.202692 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-catalog-content\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.202886 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfg6q\" (UniqueName: \"kubernetes.io/projected/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-kube-api-access-jfg6q\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.304955 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-utilities\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.305046 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-catalog-content\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.305101 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfg6q\" (UniqueName: \"kubernetes.io/projected/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-kube-api-access-jfg6q\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.305577 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-catalog-content\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.305608 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-utilities\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.323712 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfg6q\" (UniqueName: \"kubernetes.io/projected/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-kube-api-access-jfg6q\") pod \"redhat-operators-hvrb7\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") " pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.337072 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"
Mar 12 15:40:19 crc kubenswrapper[4869]: E0312 15:40:19.337508 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.424470 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:19 crc kubenswrapper[4869]: I0312 15:40:19.949153 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hvrb7"]
Mar 12 15:40:20 crc kubenswrapper[4869]: I0312 15:40:20.578018 4869 generic.go:334] "Generic (PLEG): container finished" podID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerID="859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2" exitCode=0
Mar 12 15:40:20 crc kubenswrapper[4869]: I0312 15:40:20.578285 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvrb7" event={"ID":"bfb2caba-04d8-4e4b-9f13-c739b2f9150c","Type":"ContainerDied","Data":"859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2"}
Mar 12 15:40:20 crc kubenswrapper[4869]: I0312 15:40:20.578308 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvrb7" event={"ID":"bfb2caba-04d8-4e4b-9f13-c739b2f9150c","Type":"ContainerStarted","Data":"d68d6e2e67857d210ede31225ef60a31f320e69a08b288f6947678b027164757"}
Mar 12 15:40:21 crc kubenswrapper[4869]: I0312 15:40:21.588819 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvrb7" event={"ID":"bfb2caba-04d8-4e4b-9f13-c739b2f9150c","Type":"ContainerStarted","Data":"f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6"}
Mar 12 15:40:27 crc kubenswrapper[4869]: I0312 15:40:27.835436 4869 generic.go:334] "Generic (PLEG): container finished" podID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerID="f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6" exitCode=0
Mar 12 15:40:27 crc kubenswrapper[4869]: I0312 15:40:27.835489 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvrb7" event={"ID":"bfb2caba-04d8-4e4b-9f13-c739b2f9150c","Type":"ContainerDied","Data":"f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6"}
Mar 12 15:40:28 crc kubenswrapper[4869]: I0312 15:40:28.848667 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvrb7" event={"ID":"bfb2caba-04d8-4e4b-9f13-c739b2f9150c","Type":"ContainerStarted","Data":"bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc"}
Mar 12 15:40:28 crc kubenswrapper[4869]: I0312 15:40:28.874922 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hvrb7" podStartSLOduration=2.012785571 podStartE2EDuration="9.874899078s" podCreationTimestamp="2026-03-12 15:40:19 +0000 UTC" firstStartedPulling="2026-03-12 15:40:20.579713782 +0000 UTC m=+3172.864939060" lastFinishedPulling="2026-03-12 15:40:28.441827289 +0000 UTC m=+3180.727052567" observedRunningTime="2026-03-12 15:40:28.874834086 +0000 UTC m=+3181.160059384" watchObservedRunningTime="2026-03-12 15:40:28.874899078 +0000 UTC m=+3181.160124356"
Mar 12 15:40:29 crc kubenswrapper[4869]: I0312 15:40:29.425596 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:29 crc kubenswrapper[4869]: I0312 15:40:29.425653 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:40:30 crc kubenswrapper[4869]: I0312 15:40:30.473624 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvrb7" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" probeResult="failure" output=<
Mar 12 15:40:30 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s
Mar 12 15:40:30 crc kubenswrapper[4869]: >
Mar 12 15:40:32 crc kubenswrapper[4869]: I0312 15:40:32.336865 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"
Mar 12 15:40:32 crc kubenswrapper[4869]: E0312 15:40:32.337401 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:40:40 crc kubenswrapper[4869]: I0312 15:40:40.472512 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvrb7" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" probeResult="failure" output=<
Mar 12 15:40:40 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s
Mar 12 15:40:40 crc kubenswrapper[4869]: >
Mar 12 15:40:44 crc kubenswrapper[4869]: I0312 15:40:44.678085 4869 scope.go:117] "RemoveContainer" containerID="fe0bb8275b71d5db0e6927500c30a4ef55af23c7dc5194db3b67b9f6d0256d87"
Mar 12 15:40:46 crc kubenswrapper[4869]: I0312 15:40:46.337016 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"
Mar 12 15:40:46 crc kubenswrapper[4869]: E0312 15:40:46.337790 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:40:50 crc kubenswrapper[4869]: I0312 15:40:50.489130 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvrb7" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" probeResult="failure" output=<
Mar 12 15:40:50 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s
Mar 12 15:40:50 crc kubenswrapper[4869]: >
Mar 12 15:40:58 crc kubenswrapper[4869]: I0312 15:40:58.364557 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"
Mar 12 15:40:58 crc kubenswrapper[4869]: E0312 15:40:58.365858 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:41:00 crc kubenswrapper[4869]: I0312 15:41:00.469069 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hvrb7" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" probeResult="failure" output=<
Mar 12 15:41:00 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s
Mar 12 15:41:00 crc kubenswrapper[4869]: >
Mar 12 15:41:09 crc kubenswrapper[4869]: I0312 15:41:09.475171 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:41:09 crc kubenswrapper[4869]: I0312 15:41:09.527397 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:41:09 crc kubenswrapper[4869]: I0312 15:41:09.729280 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvrb7"]
Mar 12 15:41:10 crc kubenswrapper[4869]: I0312 15:41:10.338081 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8"
Mar 12 15:41:10 crc kubenswrapper[4869]: E0312 15:41:10.338314 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91"
Mar 12 15:41:11 crc kubenswrapper[4869]: I0312 15:41:11.194680 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hvrb7" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" containerID="cri-o://bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc" gracePeriod=2
Mar 12 15:41:11 crc kubenswrapper[4869]: I0312 15:41:11.891372 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvrb7"
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.024649 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-catalog-content\") pod \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") "
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.024729 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfg6q\" (UniqueName: \"kubernetes.io/projected/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-kube-api-access-jfg6q\") pod \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") "
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.024948 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-utilities\") pod \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\" (UID: \"bfb2caba-04d8-4e4b-9f13-c739b2f9150c\") "
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.025963 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-utilities" (OuterVolumeSpecName: "utilities") pod "bfb2caba-04d8-4e4b-9f13-c739b2f9150c" (UID: "bfb2caba-04d8-4e4b-9f13-c739b2f9150c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.038760 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-kube-api-access-jfg6q" (OuterVolumeSpecName: "kube-api-access-jfg6q") pod "bfb2caba-04d8-4e4b-9f13-c739b2f9150c" (UID: "bfb2caba-04d8-4e4b-9f13-c739b2f9150c"). InnerVolumeSpecName "kube-api-access-jfg6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.127598 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfg6q\" (UniqueName: \"kubernetes.io/projected/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-kube-api-access-jfg6q\") on node \"crc\" DevicePath \"\""
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.127648 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.134597 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfb2caba-04d8-4e4b-9f13-c739b2f9150c" (UID: "bfb2caba-04d8-4e4b-9f13-c739b2f9150c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.202395 4869 generic.go:334] "Generic (PLEG): container finished" podID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerID="bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc" exitCode=0
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.202436 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvrb7" event={"ID":"bfb2caba-04d8-4e4b-9f13-c739b2f9150c","Type":"ContainerDied","Data":"bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc"}
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.202461 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hvrb7" event={"ID":"bfb2caba-04d8-4e4b-9f13-c739b2f9150c","Type":"ContainerDied","Data":"d68d6e2e67857d210ede31225ef60a31f320e69a08b288f6947678b027164757"}
Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.202476
4869 scope.go:117] "RemoveContainer" containerID="bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.202601 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hvrb7" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.229576 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfb2caba-04d8-4e4b-9f13-c739b2f9150c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.236264 4869 scope.go:117] "RemoveContainer" containerID="f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.261615 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hvrb7"] Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.269686 4869 scope.go:117] "RemoveContainer" containerID="859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.272679 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hvrb7"] Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.316678 4869 scope.go:117] "RemoveContainer" containerID="bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc" Mar 12 15:41:12 crc kubenswrapper[4869]: E0312 15:41:12.317220 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc\": container with ID starting with bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc not found: ID does not exist" containerID="bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.317287 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc"} err="failed to get container status \"bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc\": rpc error: code = NotFound desc = could not find container \"bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc\": container with ID starting with bc9ad80caa94f7b8020732cee841348f1ea12461b6f295be0ba8fa959b12cabc not found: ID does not exist" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.317321 4869 scope.go:117] "RemoveContainer" containerID="f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6" Mar 12 15:41:12 crc kubenswrapper[4869]: E0312 15:41:12.317836 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6\": container with ID starting with f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6 not found: ID does not exist" containerID="f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.317884 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6"} err="failed to get container status \"f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6\": rpc error: code = NotFound desc = could not find container \"f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6\": container with ID starting with f2a6e4ab354a4423ca753ce2e13ae5e2c80bbe6081679ec51b55b763808b4ba6 not found: ID does not exist" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.317906 4869 scope.go:117] "RemoveContainer" containerID="859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2" Mar 12 15:41:12 crc kubenswrapper[4869]: E0312 
15:41:12.318315 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2\": container with ID starting with 859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2 not found: ID does not exist" containerID="859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.318420 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2"} err="failed to get container status \"859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2\": rpc error: code = NotFound desc = could not find container \"859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2\": container with ID starting with 859a8ca83d131847e10f3098573bbf39c0b6c0d37fe2fdc47deb8ff38e27ecd2 not found: ID does not exist" Mar 12 15:41:12 crc kubenswrapper[4869]: I0312 15:41:12.348460 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" path="/var/lib/kubelet/pods/bfb2caba-04d8-4e4b-9f13-c739b2f9150c/volumes" Mar 12 15:41:23 crc kubenswrapper[4869]: I0312 15:41:23.337881 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:41:23 crc kubenswrapper[4869]: E0312 15:41:23.338636 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:41:35 crc kubenswrapper[4869]: I0312 15:41:35.336432 
4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:41:35 crc kubenswrapper[4869]: E0312 15:41:35.337220 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:41:50 crc kubenswrapper[4869]: I0312 15:41:50.337111 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:41:50 crc kubenswrapper[4869]: E0312 15:41:50.337942 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.149041 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555502-wwbtd"] Mar 12 15:42:00 crc kubenswrapper[4869]: E0312 15:42:00.150193 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="extract-utilities" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.150212 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="extract-utilities" Mar 12 15:42:00 crc kubenswrapper[4869]: E0312 15:42:00.150258 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.150267 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" Mar 12 15:42:00 crc kubenswrapper[4869]: E0312 15:42:00.150281 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="extract-content" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.150290 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="extract-content" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.150577 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb2caba-04d8-4e4b-9f13-c739b2f9150c" containerName="registry-server" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.151440 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-wwbtd" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.156448 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.156932 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.161241 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.184661 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555502-wwbtd"] Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.203329 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfk27\" (UniqueName: 
\"kubernetes.io/projected/072ae06d-72e8-4bf6-a46e-49cb7683bf85-kube-api-access-qfk27\") pod \"auto-csr-approver-29555502-wwbtd\" (UID: \"072ae06d-72e8-4bf6-a46e-49cb7683bf85\") " pod="openshift-infra/auto-csr-approver-29555502-wwbtd" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.304912 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfk27\" (UniqueName: \"kubernetes.io/projected/072ae06d-72e8-4bf6-a46e-49cb7683bf85-kube-api-access-qfk27\") pod \"auto-csr-approver-29555502-wwbtd\" (UID: \"072ae06d-72e8-4bf6-a46e-49cb7683bf85\") " pod="openshift-infra/auto-csr-approver-29555502-wwbtd" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.328311 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfk27\" (UniqueName: \"kubernetes.io/projected/072ae06d-72e8-4bf6-a46e-49cb7683bf85-kube-api-access-qfk27\") pod \"auto-csr-approver-29555502-wwbtd\" (UID: \"072ae06d-72e8-4bf6-a46e-49cb7683bf85\") " pod="openshift-infra/auto-csr-approver-29555502-wwbtd" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.480204 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-wwbtd" Mar 12 15:42:00 crc kubenswrapper[4869]: I0312 15:42:00.980206 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555502-wwbtd"] Mar 12 15:42:01 crc kubenswrapper[4869]: I0312 15:42:01.641085 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555502-wwbtd" event={"ID":"072ae06d-72e8-4bf6-a46e-49cb7683bf85","Type":"ContainerStarted","Data":"32fde2dc2ca977462292d936bea32c6812b49ed4c836153f34c030b7ca14b8ee"} Mar 12 15:42:02 crc kubenswrapper[4869]: I0312 15:42:02.650020 4869 generic.go:334] "Generic (PLEG): container finished" podID="072ae06d-72e8-4bf6-a46e-49cb7683bf85" containerID="915f2488eb0b8258c653ff9f60ac237d82fe756e3d9a1e8689fbd320662b0846" exitCode=0 Mar 12 15:42:02 crc kubenswrapper[4869]: I0312 15:42:02.650195 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555502-wwbtd" event={"ID":"072ae06d-72e8-4bf6-a46e-49cb7683bf85","Type":"ContainerDied","Data":"915f2488eb0b8258c653ff9f60ac237d82fe756e3d9a1e8689fbd320662b0846"} Mar 12 15:42:04 crc kubenswrapper[4869]: I0312 15:42:04.301320 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-wwbtd" Mar 12 15:42:04 crc kubenswrapper[4869]: I0312 15:42:04.382809 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfk27\" (UniqueName: \"kubernetes.io/projected/072ae06d-72e8-4bf6-a46e-49cb7683bf85-kube-api-access-qfk27\") pod \"072ae06d-72e8-4bf6-a46e-49cb7683bf85\" (UID: \"072ae06d-72e8-4bf6-a46e-49cb7683bf85\") " Mar 12 15:42:04 crc kubenswrapper[4869]: I0312 15:42:04.393774 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072ae06d-72e8-4bf6-a46e-49cb7683bf85-kube-api-access-qfk27" (OuterVolumeSpecName: "kube-api-access-qfk27") pod "072ae06d-72e8-4bf6-a46e-49cb7683bf85" (UID: "072ae06d-72e8-4bf6-a46e-49cb7683bf85"). InnerVolumeSpecName "kube-api-access-qfk27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:42:04 crc kubenswrapper[4869]: I0312 15:42:04.485896 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfk27\" (UniqueName: \"kubernetes.io/projected/072ae06d-72e8-4bf6-a46e-49cb7683bf85-kube-api-access-qfk27\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:04 crc kubenswrapper[4869]: I0312 15:42:04.666077 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555502-wwbtd" event={"ID":"072ae06d-72e8-4bf6-a46e-49cb7683bf85","Type":"ContainerDied","Data":"32fde2dc2ca977462292d936bea32c6812b49ed4c836153f34c030b7ca14b8ee"} Mar 12 15:42:04 crc kubenswrapper[4869]: I0312 15:42:04.666125 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32fde2dc2ca977462292d936bea32c6812b49ed4c836153f34c030b7ca14b8ee" Mar 12 15:42:04 crc kubenswrapper[4869]: I0312 15:42:04.666184 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-wwbtd" Mar 12 15:42:05 crc kubenswrapper[4869]: I0312 15:42:05.341917 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:42:05 crc kubenswrapper[4869]: E0312 15:42:05.342588 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:42:05 crc kubenswrapper[4869]: I0312 15:42:05.373786 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-fjblp"] Mar 12 15:42:05 crc kubenswrapper[4869]: I0312 15:42:05.382362 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-fjblp"] Mar 12 15:42:06 crc kubenswrapper[4869]: I0312 15:42:06.347199 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d" path="/var/lib/kubelet/pods/d404b3fc-a23e-44c6-9ca4-b2f4c41e4a6d/volumes" Mar 12 15:42:19 crc kubenswrapper[4869]: I0312 15:42:19.336528 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:42:19 crc kubenswrapper[4869]: E0312 15:42:19.338225 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:42:33 crc kubenswrapper[4869]: I0312 15:42:33.336718 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:42:33 crc kubenswrapper[4869]: E0312 15:42:33.337441 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:42:44 crc kubenswrapper[4869]: I0312 15:42:44.808735 4869 scope.go:117] "RemoveContainer" containerID="6525aa150dac46af6468f9077ba6402d1b7dae6222425afb8f2f789e91e312ac" Mar 12 15:42:45 crc kubenswrapper[4869]: I0312 15:42:45.337648 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:42:45 crc kubenswrapper[4869]: E0312 15:42:45.338294 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:42:58 crc kubenswrapper[4869]: I0312 15:42:58.343085 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:42:58 crc kubenswrapper[4869]: E0312 15:42:58.344206 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.705329 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gd5nn"] Mar 12 15:43:05 crc kubenswrapper[4869]: E0312 15:43:05.706228 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072ae06d-72e8-4bf6-a46e-49cb7683bf85" containerName="oc" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.706239 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="072ae06d-72e8-4bf6-a46e-49cb7683bf85" containerName="oc" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.706468 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="072ae06d-72e8-4bf6-a46e-49cb7683bf85" containerName="oc" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.707839 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.721210 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd5nn"] Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.769560 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6bd\" (UniqueName: \"kubernetes.io/projected/e84eb100-e84b-4234-b763-114f335a6559-kube-api-access-pc6bd\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.770194 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-utilities\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.770220 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-catalog-content\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.872364 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-utilities\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.872406 4869 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-catalog-content\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.872458 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6bd\" (UniqueName: \"kubernetes.io/projected/e84eb100-e84b-4234-b763-114f335a6559-kube-api-access-pc6bd\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.873049 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-utilities\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.873102 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-catalog-content\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:05 crc kubenswrapper[4869]: I0312 15:43:05.896417 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6bd\" (UniqueName: \"kubernetes.io/projected/e84eb100-e84b-4234-b763-114f335a6559-kube-api-access-pc6bd\") pod \"redhat-marketplace-gd5nn\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:06 crc kubenswrapper[4869]: I0312 15:43:06.034771 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:06 crc kubenswrapper[4869]: I0312 15:43:06.510201 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd5nn"] Mar 12 15:43:07 crc kubenswrapper[4869]: I0312 15:43:07.185809 4869 generic.go:334] "Generic (PLEG): container finished" podID="e84eb100-e84b-4234-b763-114f335a6559" containerID="d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26" exitCode=0 Mar 12 15:43:07 crc kubenswrapper[4869]: I0312 15:43:07.185935 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd5nn" event={"ID":"e84eb100-e84b-4234-b763-114f335a6559","Type":"ContainerDied","Data":"d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26"} Mar 12 15:43:07 crc kubenswrapper[4869]: I0312 15:43:07.186134 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd5nn" event={"ID":"e84eb100-e84b-4234-b763-114f335a6559","Type":"ContainerStarted","Data":"a891dedf9580c878d207864bc5e1e565b24be8105f8ed487cc87a056c3c9a42a"} Mar 12 15:43:08 crc kubenswrapper[4869]: I0312 15:43:08.204503 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd5nn" event={"ID":"e84eb100-e84b-4234-b763-114f335a6559","Type":"ContainerStarted","Data":"31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20"} Mar 12 15:43:10 crc kubenswrapper[4869]: I0312 15:43:10.223000 4869 generic.go:334] "Generic (PLEG): container finished" podID="e84eb100-e84b-4234-b763-114f335a6559" containerID="31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20" exitCode=0 Mar 12 15:43:10 crc kubenswrapper[4869]: I0312 15:43:10.223089 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd5nn" 
event={"ID":"e84eb100-e84b-4234-b763-114f335a6559","Type":"ContainerDied","Data":"31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20"} Mar 12 15:43:11 crc kubenswrapper[4869]: I0312 15:43:11.243089 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd5nn" event={"ID":"e84eb100-e84b-4234-b763-114f335a6559","Type":"ContainerStarted","Data":"25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930"} Mar 12 15:43:11 crc kubenswrapper[4869]: I0312 15:43:11.270959 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gd5nn" podStartSLOduration=2.753210427 podStartE2EDuration="6.270934873s" podCreationTimestamp="2026-03-12 15:43:05 +0000 UTC" firstStartedPulling="2026-03-12 15:43:07.187337923 +0000 UTC m=+3339.472563201" lastFinishedPulling="2026-03-12 15:43:10.705062369 +0000 UTC m=+3342.990287647" observedRunningTime="2026-03-12 15:43:11.265174146 +0000 UTC m=+3343.550399444" watchObservedRunningTime="2026-03-12 15:43:11.270934873 +0000 UTC m=+3343.556160151" Mar 12 15:43:12 crc kubenswrapper[4869]: I0312 15:43:12.336532 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:43:12 crc kubenswrapper[4869]: E0312 15:43:12.337070 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:43:16 crc kubenswrapper[4869]: I0312 15:43:16.035388 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:16 crc 
kubenswrapper[4869]: I0312 15:43:16.035916 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:16 crc kubenswrapper[4869]: I0312 15:43:16.083564 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:16 crc kubenswrapper[4869]: I0312 15:43:16.346150 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:16 crc kubenswrapper[4869]: I0312 15:43:16.397921 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd5nn"] Mar 12 15:43:18 crc kubenswrapper[4869]: I0312 15:43:18.302575 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gd5nn" podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="registry-server" containerID="cri-o://25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930" gracePeriod=2 Mar 12 15:43:18 crc kubenswrapper[4869]: E0312 15:43:18.516520 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84eb100_e84b_4234_b763_114f335a6559.slice/crio-conmon-25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84eb100_e84b_4234_b763_114f335a6559.slice/crio-25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930.scope\": RecentStats: unable to find data in memory cache]" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.049087 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.183498 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-catalog-content\") pod \"e84eb100-e84b-4234-b763-114f335a6559\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.183794 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc6bd\" (UniqueName: \"kubernetes.io/projected/e84eb100-e84b-4234-b763-114f335a6559-kube-api-access-pc6bd\") pod \"e84eb100-e84b-4234-b763-114f335a6559\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.183927 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-utilities\") pod \"e84eb100-e84b-4234-b763-114f335a6559\" (UID: \"e84eb100-e84b-4234-b763-114f335a6559\") " Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.184965 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-utilities" (OuterVolumeSpecName: "utilities") pod "e84eb100-e84b-4234-b763-114f335a6559" (UID: "e84eb100-e84b-4234-b763-114f335a6559"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.191564 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84eb100-e84b-4234-b763-114f335a6559-kube-api-access-pc6bd" (OuterVolumeSpecName: "kube-api-access-pc6bd") pod "e84eb100-e84b-4234-b763-114f335a6559" (UID: "e84eb100-e84b-4234-b763-114f335a6559"). InnerVolumeSpecName "kube-api-access-pc6bd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.223927 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e84eb100-e84b-4234-b763-114f335a6559" (UID: "e84eb100-e84b-4234-b763-114f335a6559"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.286124 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc6bd\" (UniqueName: \"kubernetes.io/projected/e84eb100-e84b-4234-b763-114f335a6559-kube-api-access-pc6bd\") on node \"crc\" DevicePath \"\"" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.286515 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.286528 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84eb100-e84b-4234-b763-114f335a6559-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.316191 4869 generic.go:334] "Generic (PLEG): container finished" podID="e84eb100-e84b-4234-b763-114f335a6559" containerID="25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930" exitCode=0 Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.316245 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd5nn" event={"ID":"e84eb100-e84b-4234-b763-114f335a6559","Type":"ContainerDied","Data":"25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930"} Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.316252 4869 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gd5nn" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.316279 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gd5nn" event={"ID":"e84eb100-e84b-4234-b763-114f335a6559","Type":"ContainerDied","Data":"a891dedf9580c878d207864bc5e1e565b24be8105f8ed487cc87a056c3c9a42a"} Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.316304 4869 scope.go:117] "RemoveContainer" containerID="25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.335951 4869 scope.go:117] "RemoveContainer" containerID="31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.359257 4869 scope.go:117] "RemoveContainer" containerID="d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.369348 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd5nn"] Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.377690 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gd5nn"] Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.418396 4869 scope.go:117] "RemoveContainer" containerID="25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930" Mar 12 15:43:19 crc kubenswrapper[4869]: E0312 15:43:19.418921 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930\": container with ID starting with 25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930 not found: ID does not exist" containerID="25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.419007 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930"} err="failed to get container status \"25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930\": rpc error: code = NotFound desc = could not find container \"25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930\": container with ID starting with 25188770e644fb6ddb1e45d6fb238fe093558f2b676d9848bdba95d3a8f2f930 not found: ID does not exist" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.419058 4869 scope.go:117] "RemoveContainer" containerID="31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20" Mar 12 15:43:19 crc kubenswrapper[4869]: E0312 15:43:19.419582 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20\": container with ID starting with 31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20 not found: ID does not exist" containerID="31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.419623 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20"} err="failed to get container status \"31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20\": rpc error: code = NotFound desc = could not find container \"31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20\": container with ID starting with 31fb12ea9a95423ef661bbdf86176a6106c47f5e3097cdef16a58784ff9fcc20 not found: ID does not exist" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.419652 4869 scope.go:117] "RemoveContainer" containerID="d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26" Mar 12 15:43:19 crc kubenswrapper[4869]: E0312 
15:43:19.420059 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26\": container with ID starting with d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26 not found: ID does not exist" containerID="d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26" Mar 12 15:43:19 crc kubenswrapper[4869]: I0312 15:43:19.420115 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26"} err="failed to get container status \"d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26\": rpc error: code = NotFound desc = could not find container \"d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26\": container with ID starting with d83d63d7eaaf3cc2d90cf21448c9ec1dc3c08788cd316eaa4d8d2ed683639e26 not found: ID does not exist" Mar 12 15:43:20 crc kubenswrapper[4869]: I0312 15:43:20.348750 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e84eb100-e84b-4234-b763-114f335a6559" path="/var/lib/kubelet/pods/e84eb100-e84b-4234-b763-114f335a6559/volumes" Mar 12 15:43:25 crc kubenswrapper[4869]: I0312 15:43:25.337152 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:43:25 crc kubenswrapper[4869]: E0312 15:43:25.337872 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:43:36 crc kubenswrapper[4869]: I0312 15:43:36.338204 
4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:43:36 crc kubenswrapper[4869]: E0312 15:43:36.339426 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:43:49 crc kubenswrapper[4869]: I0312 15:43:49.337651 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:43:49 crc kubenswrapper[4869]: E0312 15:43:49.338349 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.164804 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555504-pqjjc"] Mar 12 15:44:00 crc kubenswrapper[4869]: E0312 15:44:00.166120 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="extract-utilities" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.166134 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="extract-utilities" Mar 12 15:44:00 crc kubenswrapper[4869]: E0312 15:44:00.166177 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="extract-content" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.166190 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="extract-content" Mar 12 15:44:00 crc kubenswrapper[4869]: E0312 15:44:00.166199 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="registry-server" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.166206 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="registry-server" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.167900 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e84eb100-e84b-4234-b763-114f335a6559" containerName="registry-server" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.168997 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-pqjjc" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.176153 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.177000 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.177180 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.192241 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555504-pqjjc"] Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.256413 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29xbv\" (UniqueName: 
\"kubernetes.io/projected/ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5-kube-api-access-29xbv\") pod \"auto-csr-approver-29555504-pqjjc\" (UID: \"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5\") " pod="openshift-infra/auto-csr-approver-29555504-pqjjc" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.360346 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29xbv\" (UniqueName: \"kubernetes.io/projected/ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5-kube-api-access-29xbv\") pod \"auto-csr-approver-29555504-pqjjc\" (UID: \"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5\") " pod="openshift-infra/auto-csr-approver-29555504-pqjjc" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.380710 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29xbv\" (UniqueName: \"kubernetes.io/projected/ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5-kube-api-access-29xbv\") pod \"auto-csr-approver-29555504-pqjjc\" (UID: \"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5\") " pod="openshift-infra/auto-csr-approver-29555504-pqjjc" Mar 12 15:44:00 crc kubenswrapper[4869]: I0312 15:44:00.497768 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-pqjjc" Mar 12 15:44:01 crc kubenswrapper[4869]: I0312 15:44:01.038482 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555504-pqjjc"] Mar 12 15:44:01 crc kubenswrapper[4869]: I0312 15:44:01.689864 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555504-pqjjc" event={"ID":"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5","Type":"ContainerStarted","Data":"449e84c476855289806c936b46757ec903f32de9dfcdf8424f285c69851dc7c2"} Mar 12 15:44:02 crc kubenswrapper[4869]: I0312 15:44:02.700757 4869 generic.go:334] "Generic (PLEG): container finished" podID="ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5" containerID="43a32a753f159228714ea82b5b413508bcff228809087b93133227aecdfcb4c9" exitCode=0 Mar 12 15:44:02 crc kubenswrapper[4869]: I0312 15:44:02.701466 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555504-pqjjc" event={"ID":"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5","Type":"ContainerDied","Data":"43a32a753f159228714ea82b5b413508bcff228809087b93133227aecdfcb4c9"} Mar 12 15:44:03 crc kubenswrapper[4869]: I0312 15:44:03.336645 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:44:03 crc kubenswrapper[4869]: E0312 15:44:03.337092 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:44:04 crc kubenswrapper[4869]: I0312 15:44:04.365916 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-pqjjc" Mar 12 15:44:04 crc kubenswrapper[4869]: I0312 15:44:04.457759 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29xbv\" (UniqueName: \"kubernetes.io/projected/ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5-kube-api-access-29xbv\") pod \"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5\" (UID: \"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5\") " Mar 12 15:44:04 crc kubenswrapper[4869]: I0312 15:44:04.478737 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5-kube-api-access-29xbv" (OuterVolumeSpecName: "kube-api-access-29xbv") pod "ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5" (UID: "ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5"). InnerVolumeSpecName "kube-api-access-29xbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:44:04 crc kubenswrapper[4869]: I0312 15:44:04.560011 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29xbv\" (UniqueName: \"kubernetes.io/projected/ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5-kube-api-access-29xbv\") on node \"crc\" DevicePath \"\"" Mar 12 15:44:04 crc kubenswrapper[4869]: I0312 15:44:04.721856 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555504-pqjjc" event={"ID":"ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5","Type":"ContainerDied","Data":"449e84c476855289806c936b46757ec903f32de9dfcdf8424f285c69851dc7c2"} Mar 12 15:44:04 crc kubenswrapper[4869]: I0312 15:44:04.721909 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="449e84c476855289806c936b46757ec903f32de9dfcdf8424f285c69851dc7c2" Mar 12 15:44:04 crc kubenswrapper[4869]: I0312 15:44:04.721925 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-pqjjc" Mar 12 15:44:05 crc kubenswrapper[4869]: I0312 15:44:05.873122 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-nwqlg"] Mar 12 15:44:05 crc kubenswrapper[4869]: I0312 15:44:05.886218 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-nwqlg"] Mar 12 15:44:06 crc kubenswrapper[4869]: I0312 15:44:06.347135 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52818aa3-3c68-4f32-bdce-b46d8fbc7315" path="/var/lib/kubelet/pods/52818aa3-3c68-4f32-bdce-b46d8fbc7315/volumes" Mar 12 15:44:15 crc kubenswrapper[4869]: I0312 15:44:15.336744 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:44:15 crc kubenswrapper[4869]: E0312 15:44:15.338309 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:44:28 crc kubenswrapper[4869]: I0312 15:44:28.343718 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:44:28 crc kubenswrapper[4869]: E0312 15:44:28.344938 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:44:43 crc kubenswrapper[4869]: I0312 15:44:43.337274 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:44:43 crc kubenswrapper[4869]: E0312 15:44:43.339753 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:44:44 crc kubenswrapper[4869]: I0312 15:44:44.908142 4869 scope.go:117] "RemoveContainer" containerID="aa29d85398b129cdff3de361f66fe962f55883df5943e05b597a1f8bf6acbd0e" Mar 12 15:44:58 crc kubenswrapper[4869]: I0312 15:44:58.344120 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:44:59 crc kubenswrapper[4869]: I0312 15:44:59.338909 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"b6d851f5a5fde8f34bd804a8c177dc9fd9c07eee2c915ed4e71e8a9cfb1b421b"} Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.159923 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn"] Mar 12 15:45:00 crc kubenswrapper[4869]: E0312 15:45:00.160799 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5" containerName="oc" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.160813 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5" containerName="oc" Mar 12 15:45:00 crc 
kubenswrapper[4869]: I0312 15:45:00.161040 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5" containerName="oc" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.161642 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.164858 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.164920 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.176708 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn"] Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.284953 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fe6a526a-08d0-48fd-b316-2e7b27414f31-secret-volume\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.285020 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn7lp\" (UniqueName: \"kubernetes.io/projected/fe6a526a-08d0-48fd-b316-2e7b27414f31-kube-api-access-wn7lp\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.285091 4869 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe6a526a-08d0-48fd-b316-2e7b27414f31-config-volume\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.386997 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fe6a526a-08d0-48fd-b316-2e7b27414f31-secret-volume\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.387053 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn7lp\" (UniqueName: \"kubernetes.io/projected/fe6a526a-08d0-48fd-b316-2e7b27414f31-kube-api-access-wn7lp\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.387099 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe6a526a-08d0-48fd-b316-2e7b27414f31-config-volume\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.388044 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe6a526a-08d0-48fd-b316-2e7b27414f31-config-volume\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.406558 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fe6a526a-08d0-48fd-b316-2e7b27414f31-secret-volume\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.408029 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn7lp\" (UniqueName: \"kubernetes.io/projected/fe6a526a-08d0-48fd-b316-2e7b27414f31-kube-api-access-wn7lp\") pod \"collect-profiles-29555505-gbtkn\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:00 crc kubenswrapper[4869]: I0312 15:45:00.478846 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:01 crc kubenswrapper[4869]: I0312 15:45:01.102696 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn"] Mar 12 15:45:01 crc kubenswrapper[4869]: I0312 15:45:01.373145 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" event={"ID":"fe6a526a-08d0-48fd-b316-2e7b27414f31","Type":"ContainerStarted","Data":"ea5a28cca1cde6a9a96959c3846dfb1f6f00e4339b66de374f1e9f1890fc7d90"} Mar 12 15:45:01 crc kubenswrapper[4869]: I0312 15:45:01.373638 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" event={"ID":"fe6a526a-08d0-48fd-b316-2e7b27414f31","Type":"ContainerStarted","Data":"b97201292e07e2248c92423b39aaefae466ae303600bf4901091cdd595d86c5a"} Mar 12 15:45:01 crc kubenswrapper[4869]: I0312 15:45:01.395181 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" podStartSLOduration=1.395160919 podStartE2EDuration="1.395160919s" podCreationTimestamp="2026-03-12 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:45:01.392273057 +0000 UTC m=+3453.677498335" watchObservedRunningTime="2026-03-12 15:45:01.395160919 +0000 UTC m=+3453.680386197" Mar 12 15:45:02 crc kubenswrapper[4869]: I0312 15:45:02.382654 4869 generic.go:334] "Generic (PLEG): container finished" podID="fe6a526a-08d0-48fd-b316-2e7b27414f31" containerID="ea5a28cca1cde6a9a96959c3846dfb1f6f00e4339b66de374f1e9f1890fc7d90" exitCode=0 Mar 12 15:45:02 crc kubenswrapper[4869]: I0312 15:45:02.382721 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" event={"ID":"fe6a526a-08d0-48fd-b316-2e7b27414f31","Type":"ContainerDied","Data":"ea5a28cca1cde6a9a96959c3846dfb1f6f00e4339b66de374f1e9f1890fc7d90"} Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.078600 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.115143 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fe6a526a-08d0-48fd-b316-2e7b27414f31-secret-volume\") pod \"fe6a526a-08d0-48fd-b316-2e7b27414f31\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.115282 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe6a526a-08d0-48fd-b316-2e7b27414f31-config-volume\") pod \"fe6a526a-08d0-48fd-b316-2e7b27414f31\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.115304 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn7lp\" (UniqueName: \"kubernetes.io/projected/fe6a526a-08d0-48fd-b316-2e7b27414f31-kube-api-access-wn7lp\") pod \"fe6a526a-08d0-48fd-b316-2e7b27414f31\" (UID: \"fe6a526a-08d0-48fd-b316-2e7b27414f31\") " Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.118812 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe6a526a-08d0-48fd-b316-2e7b27414f31-config-volume" (OuterVolumeSpecName: "config-volume") pod "fe6a526a-08d0-48fd-b316-2e7b27414f31" (UID: "fe6a526a-08d0-48fd-b316-2e7b27414f31"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.142960 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6a526a-08d0-48fd-b316-2e7b27414f31-kube-api-access-wn7lp" (OuterVolumeSpecName: "kube-api-access-wn7lp") pod "fe6a526a-08d0-48fd-b316-2e7b27414f31" (UID: "fe6a526a-08d0-48fd-b316-2e7b27414f31"). InnerVolumeSpecName "kube-api-access-wn7lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.149361 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6a526a-08d0-48fd-b316-2e7b27414f31-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fe6a526a-08d0-48fd-b316-2e7b27414f31" (UID: "fe6a526a-08d0-48fd-b316-2e7b27414f31"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.217089 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fe6a526a-08d0-48fd-b316-2e7b27414f31-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.217119 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fe6a526a-08d0-48fd-b316-2e7b27414f31-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.217129 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn7lp\" (UniqueName: \"kubernetes.io/projected/fe6a526a-08d0-48fd-b316-2e7b27414f31-kube-api-access-wn7lp\") on node \"crc\" DevicePath \"\"" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.398767 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" 
event={"ID":"fe6a526a-08d0-48fd-b316-2e7b27414f31","Type":"ContainerDied","Data":"b97201292e07e2248c92423b39aaefae466ae303600bf4901091cdd595d86c5a"} Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.398806 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b97201292e07e2248c92423b39aaefae466ae303600bf4901091cdd595d86c5a" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.398805 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-gbtkn" Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.472033 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k"] Mar 12 15:45:04 crc kubenswrapper[4869]: I0312 15:45:04.480856 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-f925k"] Mar 12 15:45:06 crc kubenswrapper[4869]: I0312 15:45:06.351353 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6119e4f3-9261-46ad-a270-8ce6a7b7bba4" path="/var/lib/kubelet/pods/6119e4f3-9261-46ad-a270-8ce6a7b7bba4/volumes" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.483659 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nvkt4"] Mar 12 15:45:39 crc kubenswrapper[4869]: E0312 15:45:39.484605 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6a526a-08d0-48fd-b316-2e7b27414f31" containerName="collect-profiles" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.484619 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6a526a-08d0-48fd-b316-2e7b27414f31" containerName="collect-profiles" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.484827 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6a526a-08d0-48fd-b316-2e7b27414f31" containerName="collect-profiles" 
Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.486151 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.515479 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvkt4"] Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.602850 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-utilities\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.602936 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-catalog-content\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.603167 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2km\" (UniqueName: \"kubernetes.io/projected/19b18a27-3cac-49dd-ab86-2e5d12dd182d-kube-api-access-7h2km\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.690425 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pmhcb"] Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.692475 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.704414 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2km\" (UniqueName: \"kubernetes.io/projected/19b18a27-3cac-49dd-ab86-2e5d12dd182d-kube-api-access-7h2km\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.704490 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-utilities\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.704531 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-catalog-content\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.705038 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-catalog-content\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.705527 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-utilities\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " 
pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.719299 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pmhcb"] Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.750545 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2km\" (UniqueName: \"kubernetes.io/projected/19b18a27-3cac-49dd-ab86-2e5d12dd182d-kube-api-access-7h2km\") pod \"community-operators-nvkt4\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.807448 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-catalog-content\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.807890 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-utilities\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.807915 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n624\" (UniqueName: \"kubernetes.io/projected/b0503241-7eeb-4e8d-86b8-a742693d9c07-kube-api-access-5n624\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.820634 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.910322 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-catalog-content\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.910379 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-utilities\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.910405 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n624\" (UniqueName: \"kubernetes.io/projected/b0503241-7eeb-4e8d-86b8-a742693d9c07-kube-api-access-5n624\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.911376 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-catalog-content\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.911432 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-utilities\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " 
pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:39 crc kubenswrapper[4869]: I0312 15:45:39.930609 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n624\" (UniqueName: \"kubernetes.io/projected/b0503241-7eeb-4e8d-86b8-a742693d9c07-kube-api-access-5n624\") pod \"certified-operators-pmhcb\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:40 crc kubenswrapper[4869]: I0312 15:45:40.014506 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:40 crc kubenswrapper[4869]: I0312 15:45:40.463841 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvkt4"] Mar 12 15:45:40 crc kubenswrapper[4869]: I0312 15:45:40.727974 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pmhcb"] Mar 12 15:45:40 crc kubenswrapper[4869]: W0312 15:45:40.744697 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0503241_7eeb_4e8d_86b8_a742693d9c07.slice/crio-0624c1d6d89fd31c565b7f7185c6eff25863e8c0431a949357b0bd3463e99467 WatchSource:0}: Error finding container 0624c1d6d89fd31c565b7f7185c6eff25863e8c0431a949357b0bd3463e99467: Status 404 returned error can't find the container with id 0624c1d6d89fd31c565b7f7185c6eff25863e8c0431a949357b0bd3463e99467 Mar 12 15:45:40 crc kubenswrapper[4869]: I0312 15:45:40.747015 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerStarted","Data":"8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca"} Mar 12 15:45:40 crc kubenswrapper[4869]: I0312 15:45:40.747055 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerStarted","Data":"bc4cb14dac9536b4997d220592470d90be4346b2fa093416793acdcf5cf72e9d"} Mar 12 15:45:41 crc kubenswrapper[4869]: I0312 15:45:41.759456 4869 generic.go:334] "Generic (PLEG): container finished" podID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerID="a2fac089b5e0861e0b0d29e4218fafccd328665ed4098f35fd962b99e473264c" exitCode=0 Mar 12 15:45:41 crc kubenswrapper[4869]: I0312 15:45:41.759567 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pmhcb" event={"ID":"b0503241-7eeb-4e8d-86b8-a742693d9c07","Type":"ContainerDied","Data":"a2fac089b5e0861e0b0d29e4218fafccd328665ed4098f35fd962b99e473264c"} Mar 12 15:45:41 crc kubenswrapper[4869]: I0312 15:45:41.759874 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pmhcb" event={"ID":"b0503241-7eeb-4e8d-86b8-a742693d9c07","Type":"ContainerStarted","Data":"0624c1d6d89fd31c565b7f7185c6eff25863e8c0431a949357b0bd3463e99467"} Mar 12 15:45:41 crc kubenswrapper[4869]: I0312 15:45:41.762691 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:45:41 crc kubenswrapper[4869]: I0312 15:45:41.764920 4869 generic.go:334] "Generic (PLEG): container finished" podID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerID="8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca" exitCode=0 Mar 12 15:45:41 crc kubenswrapper[4869]: I0312 15:45:41.764974 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerDied","Data":"8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca"} Mar 12 15:45:42 crc kubenswrapper[4869]: I0312 15:45:42.775308 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerStarted","Data":"8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5"} Mar 12 15:45:43 crc kubenswrapper[4869]: I0312 15:45:43.799481 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pmhcb" event={"ID":"b0503241-7eeb-4e8d-86b8-a742693d9c07","Type":"ContainerStarted","Data":"6354aca810b0a8d64c157540f206089675434b6ba079db7ec3327cfaf6179fc5"} Mar 12 15:45:44 crc kubenswrapper[4869]: I0312 15:45:44.994084 4869 scope.go:117] "RemoveContainer" containerID="58ecb8bf03ff4dcc85710784c1aa389c9aeffc9e295fef9f20523709c63a9d10" Mar 12 15:45:45 crc kubenswrapper[4869]: I0312 15:45:45.817892 4869 generic.go:334] "Generic (PLEG): container finished" podID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerID="6354aca810b0a8d64c157540f206089675434b6ba079db7ec3327cfaf6179fc5" exitCode=0 Mar 12 15:45:45 crc kubenswrapper[4869]: I0312 15:45:45.817964 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pmhcb" event={"ID":"b0503241-7eeb-4e8d-86b8-a742693d9c07","Type":"ContainerDied","Data":"6354aca810b0a8d64c157540f206089675434b6ba079db7ec3327cfaf6179fc5"} Mar 12 15:45:45 crc kubenswrapper[4869]: I0312 15:45:45.820625 4869 generic.go:334] "Generic (PLEG): container finished" podID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerID="8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5" exitCode=0 Mar 12 15:45:45 crc kubenswrapper[4869]: I0312 15:45:45.820672 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerDied","Data":"8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5"} Mar 12 15:45:46 crc kubenswrapper[4869]: I0312 15:45:46.831385 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-pmhcb" event={"ID":"b0503241-7eeb-4e8d-86b8-a742693d9c07","Type":"ContainerStarted","Data":"bd1672b012eef00d58d4bebe40b60aeeacc73e19af3d0438e232312087a4c829"} Mar 12 15:45:46 crc kubenswrapper[4869]: I0312 15:45:46.835336 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerStarted","Data":"5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26"} Mar 12 15:45:46 crc kubenswrapper[4869]: I0312 15:45:46.856253 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pmhcb" podStartSLOduration=3.387005115 podStartE2EDuration="7.856231818s" podCreationTimestamp="2026-03-12 15:45:39 +0000 UTC" firstStartedPulling="2026-03-12 15:45:41.762399841 +0000 UTC m=+3494.047625119" lastFinishedPulling="2026-03-12 15:45:46.231626544 +0000 UTC m=+3498.516851822" observedRunningTime="2026-03-12 15:45:46.850176886 +0000 UTC m=+3499.135402184" watchObservedRunningTime="2026-03-12 15:45:46.856231818 +0000 UTC m=+3499.141457096" Mar 12 15:45:46 crc kubenswrapper[4869]: I0312 15:45:46.884334 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nvkt4" podStartSLOduration=3.445434115 podStartE2EDuration="7.884312767s" podCreationTimestamp="2026-03-12 15:45:39 +0000 UTC" firstStartedPulling="2026-03-12 15:45:41.782696107 +0000 UTC m=+3494.067921385" lastFinishedPulling="2026-03-12 15:45:46.221574759 +0000 UTC m=+3498.506800037" observedRunningTime="2026-03-12 15:45:46.874426856 +0000 UTC m=+3499.159652154" watchObservedRunningTime="2026-03-12 15:45:46.884312767 +0000 UTC m=+3499.169538055" Mar 12 15:45:49 crc kubenswrapper[4869]: I0312 15:45:49.821264 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 
15:45:49 crc kubenswrapper[4869]: I0312 15:45:49.822731 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:50 crc kubenswrapper[4869]: I0312 15:45:50.015772 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:50 crc kubenswrapper[4869]: I0312 15:45:50.017058 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:45:50 crc kubenswrapper[4869]: I0312 15:45:50.886233 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nvkt4" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="registry-server" probeResult="failure" output=< Mar 12 15:45:50 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 15:45:50 crc kubenswrapper[4869]: > Mar 12 15:45:51 crc kubenswrapper[4869]: I0312 15:45:51.069406 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pmhcb" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="registry-server" probeResult="failure" output=< Mar 12 15:45:51 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 15:45:51 crc kubenswrapper[4869]: > Mar 12 15:45:59 crc kubenswrapper[4869]: I0312 15:45:59.868086 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:45:59 crc kubenswrapper[4869]: I0312 15:45:59.924008 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.067036 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:46:00 crc 
kubenswrapper[4869]: I0312 15:46:00.110298 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvkt4"] Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.115407 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.140132 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555506-msbjw"] Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.141828 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-msbjw" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.147205 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.147513 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.147963 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.152604 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555506-msbjw"] Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.198268 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgcb\" (UniqueName: \"kubernetes.io/projected/a426389f-7b59-40e0-bfe1-566f38c040a9-kube-api-access-fdgcb\") pod \"auto-csr-approver-29555506-msbjw\" (UID: \"a426389f-7b59-40e0-bfe1-566f38c040a9\") " pod="openshift-infra/auto-csr-approver-29555506-msbjw" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.300564 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fdgcb\" (UniqueName: \"kubernetes.io/projected/a426389f-7b59-40e0-bfe1-566f38c040a9-kube-api-access-fdgcb\") pod \"auto-csr-approver-29555506-msbjw\" (UID: \"a426389f-7b59-40e0-bfe1-566f38c040a9\") " pod="openshift-infra/auto-csr-approver-29555506-msbjw" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.330126 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgcb\" (UniqueName: \"kubernetes.io/projected/a426389f-7b59-40e0-bfe1-566f38c040a9-kube-api-access-fdgcb\") pod \"auto-csr-approver-29555506-msbjw\" (UID: \"a426389f-7b59-40e0-bfe1-566f38c040a9\") " pod="openshift-infra/auto-csr-approver-29555506-msbjw" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.464596 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-msbjw" Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.899439 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555506-msbjw"] Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.950842 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555506-msbjw" event={"ID":"a426389f-7b59-40e0-bfe1-566f38c040a9","Type":"ContainerStarted","Data":"11166207ba5814080ddd37092e5af1bbbf0222e3fb9548ee5693c523bf65b9eb"} Mar 12 15:46:00 crc kubenswrapper[4869]: I0312 15:46:00.951006 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nvkt4" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="registry-server" containerID="cri-o://5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26" gracePeriod=2 Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.607814 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.637587 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-catalog-content\") pod \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.637945 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-utilities\") pod \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.638203 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h2km\" (UniqueName: \"kubernetes.io/projected/19b18a27-3cac-49dd-ab86-2e5d12dd182d-kube-api-access-7h2km\") pod \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\" (UID: \"19b18a27-3cac-49dd-ab86-2e5d12dd182d\") " Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.638877 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-utilities" (OuterVolumeSpecName: "utilities") pod "19b18a27-3cac-49dd-ab86-2e5d12dd182d" (UID: "19b18a27-3cac-49dd-ab86-2e5d12dd182d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.645209 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b18a27-3cac-49dd-ab86-2e5d12dd182d-kube-api-access-7h2km" (OuterVolumeSpecName: "kube-api-access-7h2km") pod "19b18a27-3cac-49dd-ab86-2e5d12dd182d" (UID: "19b18a27-3cac-49dd-ab86-2e5d12dd182d"). InnerVolumeSpecName "kube-api-access-7h2km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.689092 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19b18a27-3cac-49dd-ab86-2e5d12dd182d" (UID: "19b18a27-3cac-49dd-ab86-2e5d12dd182d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.740936 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.740981 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h2km\" (UniqueName: \"kubernetes.io/projected/19b18a27-3cac-49dd-ab86-2e5d12dd182d-kube-api-access-7h2km\") on node \"crc\" DevicePath \"\"" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.740993 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19b18a27-3cac-49dd-ab86-2e5d12dd182d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.961213 4869 generic.go:334] "Generic (PLEG): container finished" podID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerID="5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26" exitCode=0 Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.961262 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerDied","Data":"5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26"} Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.961281 4869 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvkt4" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.961346 4869 scope.go:117] "RemoveContainer" containerID="5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26" Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.961332 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkt4" event={"ID":"19b18a27-3cac-49dd-ab86-2e5d12dd182d","Type":"ContainerDied","Data":"bc4cb14dac9536b4997d220592470d90be4346b2fa093416793acdcf5cf72e9d"} Mar 12 15:46:01 crc kubenswrapper[4869]: I0312 15:46:01.987724 4869 scope.go:117] "RemoveContainer" containerID="8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.030340 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvkt4"] Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.032759 4869 scope.go:117] "RemoveContainer" containerID="8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.038345 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nvkt4"] Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.105986 4869 scope.go:117] "RemoveContainer" containerID="5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26" Mar 12 15:46:02 crc kubenswrapper[4869]: E0312 15:46:02.111509 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26\": container with ID starting with 5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26 not found: ID does not exist" containerID="5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.111586 
4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26"} err="failed to get container status \"5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26\": rpc error: code = NotFound desc = could not find container \"5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26\": container with ID starting with 5f4e59faa2caf94910291c406d672b41fc3cd4fd79ca91745b985d220a056c26 not found: ID does not exist" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.111619 4869 scope.go:117] "RemoveContainer" containerID="8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5" Mar 12 15:46:02 crc kubenswrapper[4869]: E0312 15:46:02.112269 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5\": container with ID starting with 8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5 not found: ID does not exist" containerID="8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.112310 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5"} err="failed to get container status \"8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5\": rpc error: code = NotFound desc = could not find container \"8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5\": container with ID starting with 8aeaefabd6c0e1b51858b0ee52721583bfd8509dfcea06d646264eed7cfe8ca5 not found: ID does not exist" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.112408 4869 scope.go:117] "RemoveContainer" containerID="8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca" Mar 12 15:46:02 crc kubenswrapper[4869]: E0312 
15:46:02.112814 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca\": container with ID starting with 8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca not found: ID does not exist" containerID="8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.112854 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca"} err="failed to get container status \"8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca\": rpc error: code = NotFound desc = could not find container \"8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca\": container with ID starting with 8ae21ec08b17c4e39abbcca89a2f1d00098023b09899960458b39583734254ca not found: ID does not exist" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.305518 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pmhcb"] Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.307411 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pmhcb" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="registry-server" containerID="cri-o://bd1672b012eef00d58d4bebe40b60aeeacc73e19af3d0438e232312087a4c829" gracePeriod=2 Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.346593 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" path="/var/lib/kubelet/pods/19b18a27-3cac-49dd-ab86-2e5d12dd182d/volumes" Mar 12 15:46:02 crc kubenswrapper[4869]: E0312 15:46:02.680214 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda426389f_7b59_40e0_bfe1_566f38c040a9.slice/crio-303f1f48ff121d1153c23a709e267796d746852aa7c3486ad789dffb3c647cac.scope\": RecentStats: unable to find data in memory cache]" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.971170 4869 generic.go:334] "Generic (PLEG): container finished" podID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerID="bd1672b012eef00d58d4bebe40b60aeeacc73e19af3d0438e232312087a4c829" exitCode=0 Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.971254 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pmhcb" event={"ID":"b0503241-7eeb-4e8d-86b8-a742693d9c07","Type":"ContainerDied","Data":"bd1672b012eef00d58d4bebe40b60aeeacc73e19af3d0438e232312087a4c829"} Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.971642 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pmhcb" event={"ID":"b0503241-7eeb-4e8d-86b8-a742693d9c07","Type":"ContainerDied","Data":"0624c1d6d89fd31c565b7f7185c6eff25863e8c0431a949357b0bd3463e99467"} Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.971663 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0624c1d6d89fd31c565b7f7185c6eff25863e8c0431a949357b0bd3463e99467" Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.973344 4869 generic.go:334] "Generic (PLEG): container finished" podID="a426389f-7b59-40e0-bfe1-566f38c040a9" containerID="303f1f48ff121d1153c23a709e267796d746852aa7c3486ad789dffb3c647cac" exitCode=0 Mar 12 15:46:02 crc kubenswrapper[4869]: I0312 15:46:02.973396 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555506-msbjw" event={"ID":"a426389f-7b59-40e0-bfe1-566f38c040a9","Type":"ContainerDied","Data":"303f1f48ff121d1153c23a709e267796d746852aa7c3486ad789dffb3c647cac"} Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.039335 
4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.066449 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-utilities\") pod \"b0503241-7eeb-4e8d-86b8-a742693d9c07\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.066500 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n624\" (UniqueName: \"kubernetes.io/projected/b0503241-7eeb-4e8d-86b8-a742693d9c07-kube-api-access-5n624\") pod \"b0503241-7eeb-4e8d-86b8-a742693d9c07\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.066881 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-catalog-content\") pod \"b0503241-7eeb-4e8d-86b8-a742693d9c07\" (UID: \"b0503241-7eeb-4e8d-86b8-a742693d9c07\") " Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.068504 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-utilities" (OuterVolumeSpecName: "utilities") pod "b0503241-7eeb-4e8d-86b8-a742693d9c07" (UID: "b0503241-7eeb-4e8d-86b8-a742693d9c07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.082850 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0503241-7eeb-4e8d-86b8-a742693d9c07-kube-api-access-5n624" (OuterVolumeSpecName: "kube-api-access-5n624") pod "b0503241-7eeb-4e8d-86b8-a742693d9c07" (UID: "b0503241-7eeb-4e8d-86b8-a742693d9c07"). 
InnerVolumeSpecName "kube-api-access-5n624". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.129668 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0503241-7eeb-4e8d-86b8-a742693d9c07" (UID: "b0503241-7eeb-4e8d-86b8-a742693d9c07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.169650 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.169685 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n624\" (UniqueName: \"kubernetes.io/projected/b0503241-7eeb-4e8d-86b8-a742693d9c07-kube-api-access-5n624\") on node \"crc\" DevicePath \"\"" Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.169697 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0503241-7eeb-4e8d-86b8-a742693d9c07-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:46:03 crc kubenswrapper[4869]: I0312 15:46:03.985005 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pmhcb" Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.027679 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pmhcb"] Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.035563 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pmhcb"] Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.350950 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" path="/var/lib/kubelet/pods/b0503241-7eeb-4e8d-86b8-a742693d9c07/volumes" Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.609388 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-msbjw" Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.702280 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdgcb\" (UniqueName: \"kubernetes.io/projected/a426389f-7b59-40e0-bfe1-566f38c040a9-kube-api-access-fdgcb\") pod \"a426389f-7b59-40e0-bfe1-566f38c040a9\" (UID: \"a426389f-7b59-40e0-bfe1-566f38c040a9\") " Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.710979 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a426389f-7b59-40e0-bfe1-566f38c040a9-kube-api-access-fdgcb" (OuterVolumeSpecName: "kube-api-access-fdgcb") pod "a426389f-7b59-40e0-bfe1-566f38c040a9" (UID: "a426389f-7b59-40e0-bfe1-566f38c040a9"). InnerVolumeSpecName "kube-api-access-fdgcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.804558 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdgcb\" (UniqueName: \"kubernetes.io/projected/a426389f-7b59-40e0-bfe1-566f38c040a9-kube-api-access-fdgcb\") on node \"crc\" DevicePath \"\"" Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.992926 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555506-msbjw" event={"ID":"a426389f-7b59-40e0-bfe1-566f38c040a9","Type":"ContainerDied","Data":"11166207ba5814080ddd37092e5af1bbbf0222e3fb9548ee5693c523bf65b9eb"} Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.992997 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11166207ba5814080ddd37092e5af1bbbf0222e3fb9548ee5693c523bf65b9eb" Mar 12 15:46:04 crc kubenswrapper[4869]: I0312 15:46:04.993063 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-msbjw" Mar 12 15:46:05 crc kubenswrapper[4869]: I0312 15:46:05.678618 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-84j67"] Mar 12 15:46:05 crc kubenswrapper[4869]: I0312 15:46:05.692114 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-84j67"] Mar 12 15:46:06 crc kubenswrapper[4869]: I0312 15:46:06.350129 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b01772f-9975-4982-8fd3-8fef34017b68" path="/var/lib/kubelet/pods/4b01772f-9975-4982-8fd3-8fef34017b68/volumes" Mar 12 15:46:45 crc kubenswrapper[4869]: I0312 15:46:45.092084 4869 scope.go:117] "RemoveContainer" containerID="e119966be2a6568d1202b338d024b32867398a518dce955a6a9f36a224f886b0" Mar 12 15:47:19 crc kubenswrapper[4869]: I0312 15:47:19.684453 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:47:19 crc kubenswrapper[4869]: I0312 15:47:19.685185 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:47:49 crc kubenswrapper[4869]: I0312 15:47:49.684129 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:47:49 crc kubenswrapper[4869]: I0312 15:47:49.685805 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.142436 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555508-cgxdt"] Mar 12 15:48:00 crc kubenswrapper[4869]: E0312 15:48:00.143522 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="registry-server" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143559 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="registry-server" Mar 12 15:48:00 crc kubenswrapper[4869]: E0312 15:48:00.143584 4869 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a426389f-7b59-40e0-bfe1-566f38c040a9" containerName="oc" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143592 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a426389f-7b59-40e0-bfe1-566f38c040a9" containerName="oc" Mar 12 15:48:00 crc kubenswrapper[4869]: E0312 15:48:00.143610 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="extract-content" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143618 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="extract-content" Mar 12 15:48:00 crc kubenswrapper[4869]: E0312 15:48:00.143636 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="extract-content" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143645 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="extract-content" Mar 12 15:48:00 crc kubenswrapper[4869]: E0312 15:48:00.143673 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="registry-server" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143680 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="registry-server" Mar 12 15:48:00 crc kubenswrapper[4869]: E0312 15:48:00.143693 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="extract-utilities" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143701 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="extract-utilities" Mar 12 15:48:00 crc kubenswrapper[4869]: E0312 15:48:00.143719 4869 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="extract-utilities" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143726 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="extract-utilities" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143929 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b18a27-3cac-49dd-ab86-2e5d12dd182d" containerName="registry-server" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143966 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0503241-7eeb-4e8d-86b8-a742693d9c07" containerName="registry-server" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.143993 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a426389f-7b59-40e0-bfe1-566f38c040a9" containerName="oc" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.144831 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555508-cgxdt" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.146271 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.147177 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.147211 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.162179 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555508-cgxdt"] Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.201007 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srk7m\" (UniqueName: \"kubernetes.io/projected/56100cbd-9c62-4118-9e12-cbcff7d35d81-kube-api-access-srk7m\") pod \"auto-csr-approver-29555508-cgxdt\" (UID: \"56100cbd-9c62-4118-9e12-cbcff7d35d81\") " pod="openshift-infra/auto-csr-approver-29555508-cgxdt" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.302756 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srk7m\" (UniqueName: \"kubernetes.io/projected/56100cbd-9c62-4118-9e12-cbcff7d35d81-kube-api-access-srk7m\") pod \"auto-csr-approver-29555508-cgxdt\" (UID: \"56100cbd-9c62-4118-9e12-cbcff7d35d81\") " pod="openshift-infra/auto-csr-approver-29555508-cgxdt" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.326371 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srk7m\" (UniqueName: \"kubernetes.io/projected/56100cbd-9c62-4118-9e12-cbcff7d35d81-kube-api-access-srk7m\") pod \"auto-csr-approver-29555508-cgxdt\" (UID: \"56100cbd-9c62-4118-9e12-cbcff7d35d81\") " 
pod="openshift-infra/auto-csr-approver-29555508-cgxdt" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.460953 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555508-cgxdt" Mar 12 15:48:00 crc kubenswrapper[4869]: I0312 15:48:00.918960 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555508-cgxdt"] Mar 12 15:48:01 crc kubenswrapper[4869]: I0312 15:48:01.090053 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555508-cgxdt" event={"ID":"56100cbd-9c62-4118-9e12-cbcff7d35d81","Type":"ContainerStarted","Data":"78b623e34c95c69a9fdae13b9cf29dfd1c1996f0a7116c67af3026451a522cf0"} Mar 12 15:48:03 crc kubenswrapper[4869]: I0312 15:48:03.132152 4869 generic.go:334] "Generic (PLEG): container finished" podID="56100cbd-9c62-4118-9e12-cbcff7d35d81" containerID="ffd97784ab29b96a4eebeefa77fcf59843cf36af55c472d7c2b79f2468d3d1c5" exitCode=0 Mar 12 15:48:03 crc kubenswrapper[4869]: I0312 15:48:03.132245 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555508-cgxdt" event={"ID":"56100cbd-9c62-4118-9e12-cbcff7d35d81","Type":"ContainerDied","Data":"ffd97784ab29b96a4eebeefa77fcf59843cf36af55c472d7c2b79f2468d3d1c5"} Mar 12 15:48:04 crc kubenswrapper[4869]: I0312 15:48:04.897687 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555508-cgxdt" Mar 12 15:48:04 crc kubenswrapper[4869]: I0312 15:48:04.998773 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srk7m\" (UniqueName: \"kubernetes.io/projected/56100cbd-9c62-4118-9e12-cbcff7d35d81-kube-api-access-srk7m\") pod \"56100cbd-9c62-4118-9e12-cbcff7d35d81\" (UID: \"56100cbd-9c62-4118-9e12-cbcff7d35d81\") " Mar 12 15:48:05 crc kubenswrapper[4869]: I0312 15:48:05.023376 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56100cbd-9c62-4118-9e12-cbcff7d35d81-kube-api-access-srk7m" (OuterVolumeSpecName: "kube-api-access-srk7m") pod "56100cbd-9c62-4118-9e12-cbcff7d35d81" (UID: "56100cbd-9c62-4118-9e12-cbcff7d35d81"). InnerVolumeSpecName "kube-api-access-srk7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:48:05 crc kubenswrapper[4869]: I0312 15:48:05.100938 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srk7m\" (UniqueName: \"kubernetes.io/projected/56100cbd-9c62-4118-9e12-cbcff7d35d81-kube-api-access-srk7m\") on node \"crc\" DevicePath \"\"" Mar 12 15:48:05 crc kubenswrapper[4869]: I0312 15:48:05.150884 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555508-cgxdt" event={"ID":"56100cbd-9c62-4118-9e12-cbcff7d35d81","Type":"ContainerDied","Data":"78b623e34c95c69a9fdae13b9cf29dfd1c1996f0a7116c67af3026451a522cf0"} Mar 12 15:48:05 crc kubenswrapper[4869]: I0312 15:48:05.150931 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78b623e34c95c69a9fdae13b9cf29dfd1c1996f0a7116c67af3026451a522cf0" Mar 12 15:48:05 crc kubenswrapper[4869]: I0312 15:48:05.150943 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555508-cgxdt" Mar 12 15:48:05 crc kubenswrapper[4869]: I0312 15:48:05.973335 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555502-wwbtd"] Mar 12 15:48:05 crc kubenswrapper[4869]: I0312 15:48:05.984581 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555502-wwbtd"] Mar 12 15:48:06 crc kubenswrapper[4869]: I0312 15:48:06.351627 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="072ae06d-72e8-4bf6-a46e-49cb7683bf85" path="/var/lib/kubelet/pods/072ae06d-72e8-4bf6-a46e-49cb7683bf85/volumes" Mar 12 15:48:19 crc kubenswrapper[4869]: I0312 15:48:19.684480 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:48:19 crc kubenswrapper[4869]: I0312 15:48:19.685252 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:48:19 crc kubenswrapper[4869]: I0312 15:48:19.685311 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:48:19 crc kubenswrapper[4869]: I0312 15:48:19.686247 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6d851f5a5fde8f34bd804a8c177dc9fd9c07eee2c915ed4e71e8a9cfb1b421b"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:48:19 crc kubenswrapper[4869]: I0312 15:48:19.686352 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://b6d851f5a5fde8f34bd804a8c177dc9fd9c07eee2c915ed4e71e8a9cfb1b421b" gracePeriod=600 Mar 12 15:48:20 crc kubenswrapper[4869]: I0312 15:48:20.289795 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="b6d851f5a5fde8f34bd804a8c177dc9fd9c07eee2c915ed4e71e8a9cfb1b421b" exitCode=0 Mar 12 15:48:20 crc kubenswrapper[4869]: I0312 15:48:20.290279 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"b6d851f5a5fde8f34bd804a8c177dc9fd9c07eee2c915ed4e71e8a9cfb1b421b"} Mar 12 15:48:20 crc kubenswrapper[4869]: I0312 15:48:20.290402 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c"} Mar 12 15:48:20 crc kubenswrapper[4869]: I0312 15:48:20.290446 4869 scope.go:117] "RemoveContainer" containerID="36d0bcf926e3be6f6106266c4e6cc533ce9b43631d5b1e2e852616edb70cc1f8" Mar 12 15:48:45 crc kubenswrapper[4869]: I0312 15:48:45.213340 4869 scope.go:117] "RemoveContainer" containerID="915f2488eb0b8258c653ff9f60ac237d82fe756e3d9a1e8689fbd320662b0846" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.145988 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555510-697s4"] Mar 12 15:50:00 crc kubenswrapper[4869]: E0312 
15:50:00.146875 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56100cbd-9c62-4118-9e12-cbcff7d35d81" containerName="oc" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.146889 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="56100cbd-9c62-4118-9e12-cbcff7d35d81" containerName="oc" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.147075 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="56100cbd-9c62-4118-9e12-cbcff7d35d81" containerName="oc" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.147702 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555510-697s4" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.154863 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555510-697s4"] Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.186158 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.186186 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.188104 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.332769 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdwq\" (UniqueName: \"kubernetes.io/projected/21cfe119-de9a-4446-b328-801c994bbb61-kube-api-access-4cdwq\") pod \"auto-csr-approver-29555510-697s4\" (UID: \"21cfe119-de9a-4446-b328-801c994bbb61\") " pod="openshift-infra/auto-csr-approver-29555510-697s4" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.435252 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4cdwq\" (UniqueName: \"kubernetes.io/projected/21cfe119-de9a-4446-b328-801c994bbb61-kube-api-access-4cdwq\") pod \"auto-csr-approver-29555510-697s4\" (UID: \"21cfe119-de9a-4446-b328-801c994bbb61\") " pod="openshift-infra/auto-csr-approver-29555510-697s4" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.462265 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdwq\" (UniqueName: \"kubernetes.io/projected/21cfe119-de9a-4446-b328-801c994bbb61-kube-api-access-4cdwq\") pod \"auto-csr-approver-29555510-697s4\" (UID: \"21cfe119-de9a-4446-b328-801c994bbb61\") " pod="openshift-infra/auto-csr-approver-29555510-697s4" Mar 12 15:50:00 crc kubenswrapper[4869]: I0312 15:50:00.500877 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555510-697s4" Mar 12 15:50:01 crc kubenswrapper[4869]: I0312 15:50:01.019166 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555510-697s4"] Mar 12 15:50:01 crc kubenswrapper[4869]: I0312 15:50:01.134519 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555510-697s4" event={"ID":"21cfe119-de9a-4446-b328-801c994bbb61","Type":"ContainerStarted","Data":"256c30b27dc7159f310bcdb23ee8e91d414f9c6575ca145fe8411566fe19c3c7"} Mar 12 15:50:03 crc kubenswrapper[4869]: I0312 15:50:03.153498 4869 generic.go:334] "Generic (PLEG): container finished" podID="21cfe119-de9a-4446-b328-801c994bbb61" containerID="9a61966355c148cecce6dcbab801999052bb473bac5778b3f8cdd14818f94068" exitCode=0 Mar 12 15:50:03 crc kubenswrapper[4869]: I0312 15:50:03.153613 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555510-697s4" event={"ID":"21cfe119-de9a-4446-b328-801c994bbb61","Type":"ContainerDied","Data":"9a61966355c148cecce6dcbab801999052bb473bac5778b3f8cdd14818f94068"} Mar 12 15:50:04 crc kubenswrapper[4869]: I0312 
15:50:04.748562 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555510-697s4" Mar 12 15:50:04 crc kubenswrapper[4869]: I0312 15:50:04.834355 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cdwq\" (UniqueName: \"kubernetes.io/projected/21cfe119-de9a-4446-b328-801c994bbb61-kube-api-access-4cdwq\") pod \"21cfe119-de9a-4446-b328-801c994bbb61\" (UID: \"21cfe119-de9a-4446-b328-801c994bbb61\") " Mar 12 15:50:04 crc kubenswrapper[4869]: I0312 15:50:04.853766 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cfe119-de9a-4446-b328-801c994bbb61-kube-api-access-4cdwq" (OuterVolumeSpecName: "kube-api-access-4cdwq") pod "21cfe119-de9a-4446-b328-801c994bbb61" (UID: "21cfe119-de9a-4446-b328-801c994bbb61"). InnerVolumeSpecName "kube-api-access-4cdwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:50:04 crc kubenswrapper[4869]: I0312 15:50:04.937479 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cdwq\" (UniqueName: \"kubernetes.io/projected/21cfe119-de9a-4446-b328-801c994bbb61-kube-api-access-4cdwq\") on node \"crc\" DevicePath \"\"" Mar 12 15:50:05 crc kubenswrapper[4869]: I0312 15:50:05.172770 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555510-697s4" event={"ID":"21cfe119-de9a-4446-b328-801c994bbb61","Type":"ContainerDied","Data":"256c30b27dc7159f310bcdb23ee8e91d414f9c6575ca145fe8411566fe19c3c7"} Mar 12 15:50:05 crc kubenswrapper[4869]: I0312 15:50:05.172808 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="256c30b27dc7159f310bcdb23ee8e91d414f9c6575ca145fe8411566fe19c3c7" Mar 12 15:50:05 crc kubenswrapper[4869]: I0312 15:50:05.172823 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555510-697s4" Mar 12 15:50:05 crc kubenswrapper[4869]: I0312 15:50:05.819885 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555504-pqjjc"] Mar 12 15:50:05 crc kubenswrapper[4869]: I0312 15:50:05.829952 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555504-pqjjc"] Mar 12 15:50:06 crc kubenswrapper[4869]: I0312 15:50:06.347771 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5" path="/var/lib/kubelet/pods/ad2a80b3-7549-42db-a5c0-6dc0a5ddb6a5/volumes" Mar 12 15:50:45 crc kubenswrapper[4869]: I0312 15:50:45.381450 4869 scope.go:117] "RemoveContainer" containerID="43a32a753f159228714ea82b5b413508bcff228809087b93133227aecdfcb4c9" Mar 12 15:50:49 crc kubenswrapper[4869]: I0312 15:50:49.684244 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:50:49 crc kubenswrapper[4869]: I0312 15:50:49.684554 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.158809 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lwfgn"] Mar 12 15:50:51 crc kubenswrapper[4869]: E0312 15:50:51.159450 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cfe119-de9a-4446-b328-801c994bbb61" containerName="oc" Mar 12 15:50:51 crc 
kubenswrapper[4869]: I0312 15:50:51.159463 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cfe119-de9a-4446-b328-801c994bbb61" containerName="oc" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.159717 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="21cfe119-de9a-4446-b328-801c994bbb61" containerName="oc" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.160938 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.168675 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lwfgn"] Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.249318 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82t2k\" (UniqueName: \"kubernetes.io/projected/8ffa51a1-0569-406e-aebb-3e4498a2353a-kube-api-access-82t2k\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.249816 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-utilities\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.249964 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-catalog-content\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 
15:50:51.352336 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82t2k\" (UniqueName: \"kubernetes.io/projected/8ffa51a1-0569-406e-aebb-3e4498a2353a-kube-api-access-82t2k\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.352470 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-utilities\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.352600 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-catalog-content\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.352916 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-utilities\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.353031 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-catalog-content\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.374189 4869 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-82t2k\" (UniqueName: \"kubernetes.io/projected/8ffa51a1-0569-406e-aebb-3e4498a2353a-kube-api-access-82t2k\") pod \"redhat-operators-lwfgn\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.484237 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:50:51 crc kubenswrapper[4869]: I0312 15:50:51.991361 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lwfgn"] Mar 12 15:50:52 crc kubenswrapper[4869]: W0312 15:50:52.003190 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ffa51a1_0569_406e_aebb_3e4498a2353a.slice/crio-ef27ed069be83b9b433ccd89a707c044de5e17399c3387d52fa881cb2b2dec3c WatchSource:0}: Error finding container ef27ed069be83b9b433ccd89a707c044de5e17399c3387d52fa881cb2b2dec3c: Status 404 returned error can't find the container with id ef27ed069be83b9b433ccd89a707c044de5e17399c3387d52fa881cb2b2dec3c Mar 12 15:50:52 crc kubenswrapper[4869]: I0312 15:50:52.644607 4869 generic.go:334] "Generic (PLEG): container finished" podID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerID="b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b" exitCode=0 Mar 12 15:50:52 crc kubenswrapper[4869]: I0312 15:50:52.644771 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwfgn" event={"ID":"8ffa51a1-0569-406e-aebb-3e4498a2353a","Type":"ContainerDied","Data":"b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b"} Mar 12 15:50:52 crc kubenswrapper[4869]: I0312 15:50:52.644857 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwfgn" 
event={"ID":"8ffa51a1-0569-406e-aebb-3e4498a2353a","Type":"ContainerStarted","Data":"ef27ed069be83b9b433ccd89a707c044de5e17399c3387d52fa881cb2b2dec3c"} Mar 12 15:50:52 crc kubenswrapper[4869]: I0312 15:50:52.649029 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:50:53 crc kubenswrapper[4869]: I0312 15:50:53.656256 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwfgn" event={"ID":"8ffa51a1-0569-406e-aebb-3e4498a2353a","Type":"ContainerStarted","Data":"58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438"} Mar 12 15:50:59 crc kubenswrapper[4869]: I0312 15:50:59.706098 4869 generic.go:334] "Generic (PLEG): container finished" podID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerID="58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438" exitCode=0 Mar 12 15:50:59 crc kubenswrapper[4869]: I0312 15:50:59.706176 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwfgn" event={"ID":"8ffa51a1-0569-406e-aebb-3e4498a2353a","Type":"ContainerDied","Data":"58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438"} Mar 12 15:51:00 crc kubenswrapper[4869]: I0312 15:51:00.719921 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwfgn" event={"ID":"8ffa51a1-0569-406e-aebb-3e4498a2353a","Type":"ContainerStarted","Data":"fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a"} Mar 12 15:51:00 crc kubenswrapper[4869]: I0312 15:51:00.749095 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lwfgn" podStartSLOduration=2.317379812 podStartE2EDuration="9.74907347s" podCreationTimestamp="2026-03-12 15:50:51 +0000 UTC" firstStartedPulling="2026-03-12 15:50:52.648829382 +0000 UTC m=+3804.934054660" lastFinishedPulling="2026-03-12 15:51:00.08052304 +0000 UTC m=+3812.365748318" 
observedRunningTime="2026-03-12 15:51:00.743317976 +0000 UTC m=+3813.028543264" watchObservedRunningTime="2026-03-12 15:51:00.74907347 +0000 UTC m=+3813.034298758" Mar 12 15:51:01 crc kubenswrapper[4869]: I0312 15:51:01.485012 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:51:01 crc kubenswrapper[4869]: I0312 15:51:01.485295 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:51:02 crc kubenswrapper[4869]: I0312 15:51:02.616995 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lwfgn" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="registry-server" probeResult="failure" output=< Mar 12 15:51:02 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 15:51:02 crc kubenswrapper[4869]: > Mar 12 15:51:11 crc kubenswrapper[4869]: I0312 15:51:11.546970 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:51:11 crc kubenswrapper[4869]: I0312 15:51:11.627527 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:51:11 crc kubenswrapper[4869]: I0312 15:51:11.791042 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lwfgn"] Mar 12 15:51:12 crc kubenswrapper[4869]: I0312 15:51:12.817780 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lwfgn" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="registry-server" containerID="cri-o://fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a" gracePeriod=2 Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.609878 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.716820 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82t2k\" (UniqueName: \"kubernetes.io/projected/8ffa51a1-0569-406e-aebb-3e4498a2353a-kube-api-access-82t2k\") pod \"8ffa51a1-0569-406e-aebb-3e4498a2353a\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.716943 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-catalog-content\") pod \"8ffa51a1-0569-406e-aebb-3e4498a2353a\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.717057 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-utilities\") pod \"8ffa51a1-0569-406e-aebb-3e4498a2353a\" (UID: \"8ffa51a1-0569-406e-aebb-3e4498a2353a\") " Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.717900 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-utilities" (OuterVolumeSpecName: "utilities") pod "8ffa51a1-0569-406e-aebb-3e4498a2353a" (UID: "8ffa51a1-0569-406e-aebb-3e4498a2353a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.723173 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ffa51a1-0569-406e-aebb-3e4498a2353a-kube-api-access-82t2k" (OuterVolumeSpecName: "kube-api-access-82t2k") pod "8ffa51a1-0569-406e-aebb-3e4498a2353a" (UID: "8ffa51a1-0569-406e-aebb-3e4498a2353a"). InnerVolumeSpecName "kube-api-access-82t2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.818827 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82t2k\" (UniqueName: \"kubernetes.io/projected/8ffa51a1-0569-406e-aebb-3e4498a2353a-kube-api-access-82t2k\") on node \"crc\" DevicePath \"\"" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.818857 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.836869 4869 generic.go:334] "Generic (PLEG): container finished" podID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerID="fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a" exitCode=0 Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.836920 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwfgn" event={"ID":"8ffa51a1-0569-406e-aebb-3e4498a2353a","Type":"ContainerDied","Data":"fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a"} Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.836950 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwfgn" event={"ID":"8ffa51a1-0569-406e-aebb-3e4498a2353a","Type":"ContainerDied","Data":"ef27ed069be83b9b433ccd89a707c044de5e17399c3387d52fa881cb2b2dec3c"} Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.836969 4869 scope.go:117] "RemoveContainer" containerID="fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.837113 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lwfgn" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.863090 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ffa51a1-0569-406e-aebb-3e4498a2353a" (UID: "8ffa51a1-0569-406e-aebb-3e4498a2353a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.863518 4869 scope.go:117] "RemoveContainer" containerID="58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.891983 4869 scope.go:117] "RemoveContainer" containerID="b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.920946 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ffa51a1-0569-406e-aebb-3e4498a2353a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.922127 4869 scope.go:117] "RemoveContainer" containerID="fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a" Mar 12 15:51:13 crc kubenswrapper[4869]: E0312 15:51:13.922609 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a\": container with ID starting with fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a not found: ID does not exist" containerID="fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.922661 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a"} err="failed to get container status \"fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a\": rpc error: code = NotFound desc = could not find container \"fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a\": container with ID starting with fe45be81389618379904dcec72b307bb04fbec4eb919a93cbb1add41cb4cac2a not found: ID does not exist" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.922714 4869 scope.go:117] "RemoveContainer" containerID="58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438" Mar 12 15:51:13 crc kubenswrapper[4869]: E0312 15:51:13.922996 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438\": container with ID starting with 58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438 not found: ID does not exist" containerID="58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.923017 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438"} err="failed to get container status \"58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438\": rpc error: code = NotFound desc = could not find container \"58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438\": container with ID starting with 58087fbda78a620908a92bf7bdc40b8797eb722702345c636a75ee13b3298438 not found: ID does not exist" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.923048 4869 scope.go:117] "RemoveContainer" containerID="b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b" Mar 12 15:51:13 crc kubenswrapper[4869]: E0312 15:51:13.923270 4869 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b\": container with ID starting with b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b not found: ID does not exist" containerID="b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b" Mar 12 15:51:13 crc kubenswrapper[4869]: I0312 15:51:13.923312 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b"} err="failed to get container status \"b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b\": rpc error: code = NotFound desc = could not find container \"b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b\": container with ID starting with b0df2687a82ea04a7ee31fb75f304d313414f94ec7520d97e6e8c343293b509b not found: ID does not exist" Mar 12 15:51:14 crc kubenswrapper[4869]: I0312 15:51:14.170098 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lwfgn"] Mar 12 15:51:14 crc kubenswrapper[4869]: I0312 15:51:14.178907 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lwfgn"] Mar 12 15:51:14 crc kubenswrapper[4869]: I0312 15:51:14.348197 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" path="/var/lib/kubelet/pods/8ffa51a1-0569-406e-aebb-3e4498a2353a/volumes" Mar 12 15:51:19 crc kubenswrapper[4869]: I0312 15:51:19.683940 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:51:19 crc kubenswrapper[4869]: I0312 15:51:19.684439 4869 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:51:45 crc kubenswrapper[4869]: I0312 15:51:45.470910 4869 scope.go:117] "RemoveContainer" containerID="a2fac089b5e0861e0b0d29e4218fafccd328665ed4098f35fd962b99e473264c" Mar 12 15:51:45 crc kubenswrapper[4869]: I0312 15:51:45.511083 4869 scope.go:117] "RemoveContainer" containerID="6354aca810b0a8d64c157540f206089675434b6ba079db7ec3327cfaf6179fc5" Mar 12 15:51:49 crc kubenswrapper[4869]: I0312 15:51:49.683875 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:51:49 crc kubenswrapper[4869]: I0312 15:51:49.684753 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:51:49 crc kubenswrapper[4869]: I0312 15:51:49.684826 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 15:51:49 crc kubenswrapper[4869]: I0312 15:51:49.686168 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Mar 12 15:51:49 crc kubenswrapper[4869]: I0312 15:51:49.686248 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" gracePeriod=600 Mar 12 15:51:49 crc kubenswrapper[4869]: E0312 15:51:49.816953 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:51:50 crc kubenswrapper[4869]: I0312 15:51:50.749672 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" exitCode=0 Mar 12 15:51:50 crc kubenswrapper[4869]: I0312 15:51:50.749881 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c"} Mar 12 15:51:50 crc kubenswrapper[4869]: I0312 15:51:50.750023 4869 scope.go:117] "RemoveContainer" containerID="b6d851f5a5fde8f34bd804a8c177dc9fd9c07eee2c915ed4e71e8a9cfb1b421b" Mar 12 15:51:50 crc kubenswrapper[4869]: I0312 15:51:50.750682 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:51:50 crc kubenswrapper[4869]: E0312 15:51:50.750909 4869 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.151304 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555512-hk2tt"] Mar 12 15:52:00 crc kubenswrapper[4869]: E0312 15:52:00.152228 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="extract-content" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.152240 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="extract-content" Mar 12 15:52:00 crc kubenswrapper[4869]: E0312 15:52:00.152262 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="registry-server" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.152268 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="registry-server" Mar 12 15:52:00 crc kubenswrapper[4869]: E0312 15:52:00.152284 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="extract-utilities" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.152290 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="extract-utilities" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.152465 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ffa51a1-0569-406e-aebb-3e4498a2353a" containerName="registry-server" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 
15:52:00.153067 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555512-hk2tt" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.155066 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.156080 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.158230 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.160868 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555512-hk2tt"] Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.280356 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgj4\" (UniqueName: \"kubernetes.io/projected/fca69e87-876f-414e-b342-32d0b7f09fe7-kube-api-access-8mgj4\") pod \"auto-csr-approver-29555512-hk2tt\" (UID: \"fca69e87-876f-414e-b342-32d0b7f09fe7\") " pod="openshift-infra/auto-csr-approver-29555512-hk2tt" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.382162 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgj4\" (UniqueName: \"kubernetes.io/projected/fca69e87-876f-414e-b342-32d0b7f09fe7-kube-api-access-8mgj4\") pod \"auto-csr-approver-29555512-hk2tt\" (UID: \"fca69e87-876f-414e-b342-32d0b7f09fe7\") " pod="openshift-infra/auto-csr-approver-29555512-hk2tt" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.401072 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mgj4\" (UniqueName: \"kubernetes.io/projected/fca69e87-876f-414e-b342-32d0b7f09fe7-kube-api-access-8mgj4\") pod 
\"auto-csr-approver-29555512-hk2tt\" (UID: \"fca69e87-876f-414e-b342-32d0b7f09fe7\") " pod="openshift-infra/auto-csr-approver-29555512-hk2tt" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.470389 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555512-hk2tt" Mar 12 15:52:00 crc kubenswrapper[4869]: I0312 15:52:00.941733 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555512-hk2tt"] Mar 12 15:52:01 crc kubenswrapper[4869]: I0312 15:52:01.857591 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555512-hk2tt" event={"ID":"fca69e87-876f-414e-b342-32d0b7f09fe7","Type":"ContainerStarted","Data":"0c2a7c3a5b4072478f796da964c5c2b500564c2f047a3cd239591b576d566eac"} Mar 12 15:52:02 crc kubenswrapper[4869]: I0312 15:52:02.868929 4869 generic.go:334] "Generic (PLEG): container finished" podID="fca69e87-876f-414e-b342-32d0b7f09fe7" containerID="4fc4e2c64a4d84a66221582e5d1750188efd995bcf776304da9a1caedfaa7cb6" exitCode=0 Mar 12 15:52:02 crc kubenswrapper[4869]: I0312 15:52:02.869273 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555512-hk2tt" event={"ID":"fca69e87-876f-414e-b342-32d0b7f09fe7","Type":"ContainerDied","Data":"4fc4e2c64a4d84a66221582e5d1750188efd995bcf776304da9a1caedfaa7cb6"} Mar 12 15:52:04 crc kubenswrapper[4869]: I0312 15:52:04.513944 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555512-hk2tt" Mar 12 15:52:04 crc kubenswrapper[4869]: I0312 15:52:04.559043 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mgj4\" (UniqueName: \"kubernetes.io/projected/fca69e87-876f-414e-b342-32d0b7f09fe7-kube-api-access-8mgj4\") pod \"fca69e87-876f-414e-b342-32d0b7f09fe7\" (UID: \"fca69e87-876f-414e-b342-32d0b7f09fe7\") " Mar 12 15:52:04 crc kubenswrapper[4869]: I0312 15:52:04.576944 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca69e87-876f-414e-b342-32d0b7f09fe7-kube-api-access-8mgj4" (OuterVolumeSpecName: "kube-api-access-8mgj4") pod "fca69e87-876f-414e-b342-32d0b7f09fe7" (UID: "fca69e87-876f-414e-b342-32d0b7f09fe7"). InnerVolumeSpecName "kube-api-access-8mgj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:52:04 crc kubenswrapper[4869]: I0312 15:52:04.662791 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mgj4\" (UniqueName: \"kubernetes.io/projected/fca69e87-876f-414e-b342-32d0b7f09fe7-kube-api-access-8mgj4\") on node \"crc\" DevicePath \"\"" Mar 12 15:52:04 crc kubenswrapper[4869]: I0312 15:52:04.893851 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555512-hk2tt" event={"ID":"fca69e87-876f-414e-b342-32d0b7f09fe7","Type":"ContainerDied","Data":"0c2a7c3a5b4072478f796da964c5c2b500564c2f047a3cd239591b576d566eac"} Mar 12 15:52:04 crc kubenswrapper[4869]: I0312 15:52:04.893898 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2a7c3a5b4072478f796da964c5c2b500564c2f047a3cd239591b576d566eac" Mar 12 15:52:04 crc kubenswrapper[4869]: I0312 15:52:04.893920 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555512-hk2tt" Mar 12 15:52:05 crc kubenswrapper[4869]: I0312 15:52:05.336733 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:52:05 crc kubenswrapper[4869]: E0312 15:52:05.337605 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:52:05 crc kubenswrapper[4869]: I0312 15:52:05.584182 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555506-msbjw"] Mar 12 15:52:05 crc kubenswrapper[4869]: I0312 15:52:05.591733 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555506-msbjw"] Mar 12 15:52:06 crc kubenswrapper[4869]: I0312 15:52:06.383242 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a426389f-7b59-40e0-bfe1-566f38c040a9" path="/var/lib/kubelet/pods/a426389f-7b59-40e0-bfe1-566f38c040a9/volumes" Mar 12 15:52:19 crc kubenswrapper[4869]: I0312 15:52:19.338435 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:52:19 crc kubenswrapper[4869]: E0312 15:52:19.339418 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:52:34 crc kubenswrapper[4869]: I0312 15:52:34.336920 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:52:34 crc kubenswrapper[4869]: E0312 15:52:34.337855 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:52:45 crc kubenswrapper[4869]: I0312 15:52:45.591078 4869 scope.go:117] "RemoveContainer" containerID="303f1f48ff121d1153c23a709e267796d746852aa7c3486ad789dffb3c647cac" Mar 12 15:52:45 crc kubenswrapper[4869]: I0312 15:52:45.643212 4869 scope.go:117] "RemoveContainer" containerID="bd1672b012eef00d58d4bebe40b60aeeacc73e19af3d0438e232312087a4c829" Mar 12 15:52:48 crc kubenswrapper[4869]: I0312 15:52:48.365140 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:52:48 crc kubenswrapper[4869]: E0312 15:52:48.365982 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:53:00 crc kubenswrapper[4869]: I0312 15:53:00.336997 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:53:00 crc kubenswrapper[4869]: E0312 15:53:00.337800 4869 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:53:15 crc kubenswrapper[4869]: I0312 15:53:15.336622 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:53:15 crc kubenswrapper[4869]: E0312 15:53:15.338489 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:53:28 crc kubenswrapper[4869]: I0312 15:53:28.346292 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:53:28 crc kubenswrapper[4869]: E0312 15:53:28.347129 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:53:42 crc kubenswrapper[4869]: I0312 15:53:42.337638 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:53:42 crc kubenswrapper[4869]: E0312 
15:53:42.338658 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:53:55 crc kubenswrapper[4869]: I0312 15:53:55.336847 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:53:55 crc kubenswrapper[4869]: E0312 15:53:55.338782 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.142897 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555514-6vcqk"] Mar 12 15:54:00 crc kubenswrapper[4869]: E0312 15:54:00.144164 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca69e87-876f-414e-b342-32d0b7f09fe7" containerName="oc" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.144181 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca69e87-876f-414e-b342-32d0b7f09fe7" containerName="oc" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.144448 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca69e87-876f-414e-b342-32d0b7f09fe7" containerName="oc" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.145247 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.147393 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.147613 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.147903 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.155979 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555514-6vcqk"] Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.197759 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zz6\" (UniqueName: \"kubernetes.io/projected/33af9345-810d-48e3-9531-a4d445b5a6a1-kube-api-access-j2zz6\") pod \"auto-csr-approver-29555514-6vcqk\" (UID: \"33af9345-810d-48e3-9531-a4d445b5a6a1\") " pod="openshift-infra/auto-csr-approver-29555514-6vcqk" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.299726 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zz6\" (UniqueName: \"kubernetes.io/projected/33af9345-810d-48e3-9531-a4d445b5a6a1-kube-api-access-j2zz6\") pod \"auto-csr-approver-29555514-6vcqk\" (UID: \"33af9345-810d-48e3-9531-a4d445b5a6a1\") " pod="openshift-infra/auto-csr-approver-29555514-6vcqk" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.328510 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zz6\" (UniqueName: \"kubernetes.io/projected/33af9345-810d-48e3-9531-a4d445b5a6a1-kube-api-access-j2zz6\") pod \"auto-csr-approver-29555514-6vcqk\" (UID: \"33af9345-810d-48e3-9531-a4d445b5a6a1\") " 
pod="openshift-infra/auto-csr-approver-29555514-6vcqk" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.463740 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" Mar 12 15:54:00 crc kubenswrapper[4869]: I0312 15:54:00.990853 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555514-6vcqk"] Mar 12 15:54:01 crc kubenswrapper[4869]: I0312 15:54:01.869170 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" event={"ID":"33af9345-810d-48e3-9531-a4d445b5a6a1","Type":"ContainerStarted","Data":"6312148f9b624fd81c9b0c706e3f4f46b265210744180376d81fcb709f646eb0"} Mar 12 15:54:02 crc kubenswrapper[4869]: I0312 15:54:02.877853 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" event={"ID":"33af9345-810d-48e3-9531-a4d445b5a6a1","Type":"ContainerStarted","Data":"440387b5434a645f86069b4f40bcdf88e7bf4d75483193ec2bb74b40445ac8d9"} Mar 12 15:54:02 crc kubenswrapper[4869]: I0312 15:54:02.894996 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" podStartSLOduration=1.408330774 podStartE2EDuration="2.89496969s" podCreationTimestamp="2026-03-12 15:54:00 +0000 UTC" firstStartedPulling="2026-03-12 15:54:00.992932623 +0000 UTC m=+3993.278157901" lastFinishedPulling="2026-03-12 15:54:02.479571529 +0000 UTC m=+3994.764796817" observedRunningTime="2026-03-12 15:54:02.889313669 +0000 UTC m=+3995.174538947" watchObservedRunningTime="2026-03-12 15:54:02.89496969 +0000 UTC m=+3995.180194968" Mar 12 15:54:03 crc kubenswrapper[4869]: I0312 15:54:03.887678 4869 generic.go:334] "Generic (PLEG): container finished" podID="33af9345-810d-48e3-9531-a4d445b5a6a1" containerID="440387b5434a645f86069b4f40bcdf88e7bf4d75483193ec2bb74b40445ac8d9" exitCode=0 Mar 12 15:54:03 crc 
kubenswrapper[4869]: I0312 15:54:03.888003 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" event={"ID":"33af9345-810d-48e3-9531-a4d445b5a6a1","Type":"ContainerDied","Data":"440387b5434a645f86069b4f40bcdf88e7bf4d75483193ec2bb74b40445ac8d9"} Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.461397 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.498299 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2zz6\" (UniqueName: \"kubernetes.io/projected/33af9345-810d-48e3-9531-a4d445b5a6a1-kube-api-access-j2zz6\") pod \"33af9345-810d-48e3-9531-a4d445b5a6a1\" (UID: \"33af9345-810d-48e3-9531-a4d445b5a6a1\") " Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.515168 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33af9345-810d-48e3-9531-a4d445b5a6a1-kube-api-access-j2zz6" (OuterVolumeSpecName: "kube-api-access-j2zz6") pod "33af9345-810d-48e3-9531-a4d445b5a6a1" (UID: "33af9345-810d-48e3-9531-a4d445b5a6a1"). InnerVolumeSpecName "kube-api-access-j2zz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.600956 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2zz6\" (UniqueName: \"kubernetes.io/projected/33af9345-810d-48e3-9531-a4d445b5a6a1-kube-api-access-j2zz6\") on node \"crc\" DevicePath \"\"" Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.905992 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" event={"ID":"33af9345-810d-48e3-9531-a4d445b5a6a1","Type":"ContainerDied","Data":"6312148f9b624fd81c9b0c706e3f4f46b265210744180376d81fcb709f646eb0"} Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.906400 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6312148f9b624fd81c9b0c706e3f4f46b265210744180376d81fcb709f646eb0" Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.906071 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555514-6vcqk" Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.962648 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555508-cgxdt"] Mar 12 15:54:05 crc kubenswrapper[4869]: I0312 15:54:05.979512 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555508-cgxdt"] Mar 12 15:54:06 crc kubenswrapper[4869]: E0312 15:54:06.128065 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33af9345_810d_48e3_9531_a4d445b5a6a1.slice/crio-6312148f9b624fd81c9b0c706e3f4f46b265210744180376d81fcb709f646eb0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33af9345_810d_48e3_9531_a4d445b5a6a1.slice\": RecentStats: unable to find data in memory cache]" 
Mar 12 15:54:06 crc kubenswrapper[4869]: I0312 15:54:06.347695 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56100cbd-9c62-4118-9e12-cbcff7d35d81" path="/var/lib/kubelet/pods/56100cbd-9c62-4118-9e12-cbcff7d35d81/volumes" Mar 12 15:54:09 crc kubenswrapper[4869]: I0312 15:54:09.336725 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:54:09 crc kubenswrapper[4869]: E0312 15:54:09.337415 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:54:22 crc kubenswrapper[4869]: I0312 15:54:22.337176 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:54:22 crc kubenswrapper[4869]: E0312 15:54:22.337807 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:54:36 crc kubenswrapper[4869]: I0312 15:54:36.337176 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:54:36 crc kubenswrapper[4869]: E0312 15:54:36.337977 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:54:45 crc kubenswrapper[4869]: I0312 15:54:45.728293 4869 scope.go:117] "RemoveContainer" containerID="ffd97784ab29b96a4eebeefa77fcf59843cf36af55c472d7c2b79f2468d3d1c5" Mar 12 15:54:51 crc kubenswrapper[4869]: I0312 15:54:51.336838 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:54:51 crc kubenswrapper[4869]: E0312 15:54:51.338138 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:55:06 crc kubenswrapper[4869]: I0312 15:55:06.336733 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:55:06 crc kubenswrapper[4869]: E0312 15:55:06.337515 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:55:19 crc kubenswrapper[4869]: I0312 15:55:19.336826 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:55:19 crc 
kubenswrapper[4869]: E0312 15:55:19.337495 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:55:30 crc kubenswrapper[4869]: I0312 15:55:30.337442 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:55:30 crc kubenswrapper[4869]: E0312 15:55:30.338295 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:55:42 crc kubenswrapper[4869]: I0312 15:55:42.336237 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:55:42 crc kubenswrapper[4869]: E0312 15:55:42.337333 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.849578 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-clpb4"] Mar 12 
15:55:50 crc kubenswrapper[4869]: E0312 15:55:50.850416 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33af9345-810d-48e3-9531-a4d445b5a6a1" containerName="oc" Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.850429 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="33af9345-810d-48e3-9531-a4d445b5a6a1" containerName="oc" Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.850707 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="33af9345-810d-48e3-9531-a4d445b5a6a1" containerName="oc" Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.852404 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.867381 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clpb4"] Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.962593 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cjc9\" (UniqueName: \"kubernetes.io/projected/e2dbe8cd-729f-4947-afdd-d09422953d60-kube-api-access-9cjc9\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.962717 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-catalog-content\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:50 crc kubenswrapper[4869]: I0312 15:55:50.962828 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-utilities\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:51 crc kubenswrapper[4869]: I0312 15:55:51.065048 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-catalog-content\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:51 crc kubenswrapper[4869]: I0312 15:55:51.065385 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-utilities\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:51 crc kubenswrapper[4869]: I0312 15:55:51.065447 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cjc9\" (UniqueName: \"kubernetes.io/projected/e2dbe8cd-729f-4947-afdd-d09422953d60-kube-api-access-9cjc9\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:51 crc kubenswrapper[4869]: I0312 15:55:51.065568 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-catalog-content\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:51 crc kubenswrapper[4869]: I0312 15:55:51.065832 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-utilities\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:51 crc kubenswrapper[4869]: I0312 15:55:51.089647 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cjc9\" (UniqueName: \"kubernetes.io/projected/e2dbe8cd-729f-4947-afdd-d09422953d60-kube-api-access-9cjc9\") pod \"community-operators-clpb4\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:51 crc kubenswrapper[4869]: I0312 15:55:51.179675 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:55:52 crc kubenswrapper[4869]: I0312 15:55:52.249404 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clpb4"] Mar 12 15:55:52 crc kubenswrapper[4869]: I0312 15:55:52.910784 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerID="290a75b3566b355e4311f4dc90c395bbdb068b422e77ad31602427c8e47786f1" exitCode=0 Mar 12 15:55:52 crc kubenswrapper[4869]: I0312 15:55:52.910837 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clpb4" event={"ID":"e2dbe8cd-729f-4947-afdd-d09422953d60","Type":"ContainerDied","Data":"290a75b3566b355e4311f4dc90c395bbdb068b422e77ad31602427c8e47786f1"} Mar 12 15:55:52 crc kubenswrapper[4869]: I0312 15:55:52.911095 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clpb4" event={"ID":"e2dbe8cd-729f-4947-afdd-d09422953d60","Type":"ContainerStarted","Data":"58c71e8083e655d9bd35c1ebd89286c36a793d8c80893e21f7174578ca224306"} Mar 12 15:55:52 crc kubenswrapper[4869]: I0312 15:55:52.912599 4869 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 12 15:55:54 crc kubenswrapper[4869]: I0312 15:55:54.927794 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clpb4" event={"ID":"e2dbe8cd-729f-4947-afdd-d09422953d60","Type":"ContainerStarted","Data":"d8d16c83826a008d768f530722c1e4b2dd706756e102e6e7b19811950c83e927"} Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.436699 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q987x"] Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.439088 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.446449 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q987x"] Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.580630 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-catalog-content\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.581056 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-utilities\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.581118 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc88h\" (UniqueName: 
\"kubernetes.io/projected/8bb95c2c-78f1-4541-b01a-d40615995d8c-kube-api-access-fc88h\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.683940 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-utilities\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.684404 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc88h\" (UniqueName: \"kubernetes.io/projected/8bb95c2c-78f1-4541-b01a-d40615995d8c-kube-api-access-fc88h\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.684563 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-catalog-content\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.684829 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-utilities\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.685132 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-catalog-content\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.709701 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc88h\" (UniqueName: \"kubernetes.io/projected/8bb95c2c-78f1-4541-b01a-d40615995d8c-kube-api-access-fc88h\") pod \"certified-operators-q987x\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.767058 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.961712 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerID="d8d16c83826a008d768f530722c1e4b2dd706756e102e6e7b19811950c83e927" exitCode=0 Mar 12 15:55:55 crc kubenswrapper[4869]: I0312 15:55:55.962921 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clpb4" event={"ID":"e2dbe8cd-729f-4947-afdd-d09422953d60","Type":"ContainerDied","Data":"d8d16c83826a008d768f530722c1e4b2dd706756e102e6e7b19811950c83e927"} Mar 12 15:55:56 crc kubenswrapper[4869]: I0312 15:55:56.360441 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q987x"] Mar 12 15:55:56 crc kubenswrapper[4869]: I0312 15:55:56.974725 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clpb4" event={"ID":"e2dbe8cd-729f-4947-afdd-d09422953d60","Type":"ContainerStarted","Data":"d180263f821b245f288ddc63ed63e0a03401b9204639d61097b20ac644620e85"} Mar 12 15:55:56 crc kubenswrapper[4869]: I0312 15:55:56.977604 4869 generic.go:334] "Generic (PLEG): container 
finished" podID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerID="22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638" exitCode=0 Mar 12 15:55:56 crc kubenswrapper[4869]: I0312 15:55:56.977647 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q987x" event={"ID":"8bb95c2c-78f1-4541-b01a-d40615995d8c","Type":"ContainerDied","Data":"22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638"} Mar 12 15:55:56 crc kubenswrapper[4869]: I0312 15:55:56.977669 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q987x" event={"ID":"8bb95c2c-78f1-4541-b01a-d40615995d8c","Type":"ContainerStarted","Data":"a0ec90be26e8a4cdfd1cf176105960f8283350de6d89f54abac117f49442b2dc"} Mar 12 15:55:56 crc kubenswrapper[4869]: I0312 15:55:56.999206 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-clpb4" podStartSLOduration=3.500067538 podStartE2EDuration="6.999183759s" podCreationTimestamp="2026-03-12 15:55:50 +0000 UTC" firstStartedPulling="2026-03-12 15:55:52.912386711 +0000 UTC m=+4105.197611989" lastFinishedPulling="2026-03-12 15:55:56.411502932 +0000 UTC m=+4108.696728210" observedRunningTime="2026-03-12 15:55:56.991961005 +0000 UTC m=+4109.277186343" watchObservedRunningTime="2026-03-12 15:55:56.999183759 +0000 UTC m=+4109.284409047" Mar 12 15:55:57 crc kubenswrapper[4869]: I0312 15:55:57.336595 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:55:57 crc kubenswrapper[4869]: E0312 15:55:57.337105 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:55:58 crc kubenswrapper[4869]: I0312 15:55:58.995594 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q987x" event={"ID":"8bb95c2c-78f1-4541-b01a-d40615995d8c","Type":"ContainerStarted","Data":"a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d"} Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.147158 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555516-pr7cb"] Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.148834 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.151353 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.151553 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.151622 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.158923 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555516-pr7cb"] Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.179617 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf77f\" (UniqueName: \"kubernetes.io/projected/45f832c1-0ece-49ee-9a70-08ec6ceac27e-kube-api-access-pf77f\") pod \"auto-csr-approver-29555516-pr7cb\" (UID: \"45f832c1-0ece-49ee-9a70-08ec6ceac27e\") " pod="openshift-infra/auto-csr-approver-29555516-pr7cb" Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 
15:56:00.281573 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf77f\" (UniqueName: \"kubernetes.io/projected/45f832c1-0ece-49ee-9a70-08ec6ceac27e-kube-api-access-pf77f\") pod \"auto-csr-approver-29555516-pr7cb\" (UID: \"45f832c1-0ece-49ee-9a70-08ec6ceac27e\") " pod="openshift-infra/auto-csr-approver-29555516-pr7cb" Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.300424 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf77f\" (UniqueName: \"kubernetes.io/projected/45f832c1-0ece-49ee-9a70-08ec6ceac27e-kube-api-access-pf77f\") pod \"auto-csr-approver-29555516-pr7cb\" (UID: \"45f832c1-0ece-49ee-9a70-08ec6ceac27e\") " pod="openshift-infra/auto-csr-approver-29555516-pr7cb" Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.497787 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" Mar 12 15:56:00 crc kubenswrapper[4869]: W0312 15:56:00.958042 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f832c1_0ece_49ee_9a70_08ec6ceac27e.slice/crio-5b046e4315fd5b725dd5ab7aa314700c9b25ba570f3b0469316430a9ce433942 WatchSource:0}: Error finding container 5b046e4315fd5b725dd5ab7aa314700c9b25ba570f3b0469316430a9ce433942: Status 404 returned error can't find the container with id 5b046e4315fd5b725dd5ab7aa314700c9b25ba570f3b0469316430a9ce433942 Mar 12 15:56:00 crc kubenswrapper[4869]: I0312 15:56:00.962278 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555516-pr7cb"] Mar 12 15:56:01 crc kubenswrapper[4869]: I0312 15:56:01.015713 4869 generic.go:334] "Generic (PLEG): container finished" podID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerID="a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d" exitCode=0 Mar 12 15:56:01 crc kubenswrapper[4869]: I0312 
15:56:01.015919 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q987x" event={"ID":"8bb95c2c-78f1-4541-b01a-d40615995d8c","Type":"ContainerDied","Data":"a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d"} Mar 12 15:56:01 crc kubenswrapper[4869]: I0312 15:56:01.018958 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" event={"ID":"45f832c1-0ece-49ee-9a70-08ec6ceac27e","Type":"ContainerStarted","Data":"5b046e4315fd5b725dd5ab7aa314700c9b25ba570f3b0469316430a9ce433942"} Mar 12 15:56:01 crc kubenswrapper[4869]: I0312 15:56:01.180953 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:56:01 crc kubenswrapper[4869]: I0312 15:56:01.181015 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:56:01 crc kubenswrapper[4869]: I0312 15:56:01.240265 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:56:02 crc kubenswrapper[4869]: I0312 15:56:02.031033 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q987x" event={"ID":"8bb95c2c-78f1-4541-b01a-d40615995d8c","Type":"ContainerStarted","Data":"cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc"} Mar 12 15:56:02 crc kubenswrapper[4869]: I0312 15:56:02.054318 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q987x" podStartSLOduration=2.5671803 podStartE2EDuration="7.054296531s" podCreationTimestamp="2026-03-12 15:55:55 +0000 UTC" firstStartedPulling="2026-03-12 15:55:56.979205264 +0000 UTC m=+4109.264430552" lastFinishedPulling="2026-03-12 15:56:01.466321505 +0000 UTC m=+4113.751546783" observedRunningTime="2026-03-12 
15:56:02.04827192 +0000 UTC m=+4114.333497198" watchObservedRunningTime="2026-03-12 15:56:02.054296531 +0000 UTC m=+4114.339521809" Mar 12 15:56:02 crc kubenswrapper[4869]: I0312 15:56:02.080600 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:56:03 crc kubenswrapper[4869]: I0312 15:56:03.043325 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" event={"ID":"45f832c1-0ece-49ee-9a70-08ec6ceac27e","Type":"ContainerStarted","Data":"7dba95faaa5ee2e852fa2a03dd76ddc77d80bafebe1befb813732fc5faaf93fe"} Mar 12 15:56:03 crc kubenswrapper[4869]: I0312 15:56:03.068505 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" podStartSLOduration=2.101181268 podStartE2EDuration="3.068468862s" podCreationTimestamp="2026-03-12 15:56:00 +0000 UTC" firstStartedPulling="2026-03-12 15:56:00.960982109 +0000 UTC m=+4113.246207407" lastFinishedPulling="2026-03-12 15:56:01.928269723 +0000 UTC m=+4114.213495001" observedRunningTime="2026-03-12 15:56:03.060116846 +0000 UTC m=+4115.345342124" watchObservedRunningTime="2026-03-12 15:56:03.068468862 +0000 UTC m=+4115.353694150" Mar 12 15:56:03 crc kubenswrapper[4869]: I0312 15:56:03.624852 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clpb4"] Mar 12 15:56:04 crc kubenswrapper[4869]: I0312 15:56:04.055248 4869 generic.go:334] "Generic (PLEG): container finished" podID="45f832c1-0ece-49ee-9a70-08ec6ceac27e" containerID="7dba95faaa5ee2e852fa2a03dd76ddc77d80bafebe1befb813732fc5faaf93fe" exitCode=0 Mar 12 15:56:04 crc kubenswrapper[4869]: I0312 15:56:04.055318 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" 
event={"ID":"45f832c1-0ece-49ee-9a70-08ec6ceac27e","Type":"ContainerDied","Data":"7dba95faaa5ee2e852fa2a03dd76ddc77d80bafebe1befb813732fc5faaf93fe"} Mar 12 15:56:04 crc kubenswrapper[4869]: I0312 15:56:04.057433 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-clpb4" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="registry-server" containerID="cri-o://d180263f821b245f288ddc63ed63e0a03401b9204639d61097b20ac644620e85" gracePeriod=2 Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.068970 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerID="d180263f821b245f288ddc63ed63e0a03401b9204639d61097b20ac644620e85" exitCode=0 Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.069044 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clpb4" event={"ID":"e2dbe8cd-729f-4947-afdd-d09422953d60","Type":"ContainerDied","Data":"d180263f821b245f288ddc63ed63e0a03401b9204639d61097b20ac644620e85"} Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.069569 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clpb4" event={"ID":"e2dbe8cd-729f-4947-afdd-d09422953d60","Type":"ContainerDied","Data":"58c71e8083e655d9bd35c1ebd89286c36a793d8c80893e21f7174578ca224306"} Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.069581 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58c71e8083e655d9bd35c1ebd89286c36a793d8c80893e21f7174578ca224306" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.117393 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.233769 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cjc9\" (UniqueName: \"kubernetes.io/projected/e2dbe8cd-729f-4947-afdd-d09422953d60-kube-api-access-9cjc9\") pod \"e2dbe8cd-729f-4947-afdd-d09422953d60\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.234182 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-catalog-content\") pod \"e2dbe8cd-729f-4947-afdd-d09422953d60\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.234231 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-utilities\") pod \"e2dbe8cd-729f-4947-afdd-d09422953d60\" (UID: \"e2dbe8cd-729f-4947-afdd-d09422953d60\") " Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.234851 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-utilities" (OuterVolumeSpecName: "utilities") pod "e2dbe8cd-729f-4947-afdd-d09422953d60" (UID: "e2dbe8cd-729f-4947-afdd-d09422953d60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.246754 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2dbe8cd-729f-4947-afdd-d09422953d60-kube-api-access-9cjc9" (OuterVolumeSpecName: "kube-api-access-9cjc9") pod "e2dbe8cd-729f-4947-afdd-d09422953d60" (UID: "e2dbe8cd-729f-4947-afdd-d09422953d60"). InnerVolumeSpecName "kube-api-access-9cjc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.295232 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2dbe8cd-729f-4947-afdd-d09422953d60" (UID: "e2dbe8cd-729f-4947-afdd-d09422953d60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.337025 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cjc9\" (UniqueName: \"kubernetes.io/projected/e2dbe8cd-729f-4947-afdd-d09422953d60-kube-api-access-9cjc9\") on node \"crc\" DevicePath \"\"" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.337303 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.337718 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dbe8cd-729f-4947-afdd-d09422953d60-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.656570 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.744198 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf77f\" (UniqueName: \"kubernetes.io/projected/45f832c1-0ece-49ee-9a70-08ec6ceac27e-kube-api-access-pf77f\") pod \"45f832c1-0ece-49ee-9a70-08ec6ceac27e\" (UID: \"45f832c1-0ece-49ee-9a70-08ec6ceac27e\") " Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.755722 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f832c1-0ece-49ee-9a70-08ec6ceac27e-kube-api-access-pf77f" (OuterVolumeSpecName: "kube-api-access-pf77f") pod "45f832c1-0ece-49ee-9a70-08ec6ceac27e" (UID: "45f832c1-0ece-49ee-9a70-08ec6ceac27e"). InnerVolumeSpecName "kube-api-access-pf77f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.768226 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.768511 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.829878 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:56:05 crc kubenswrapper[4869]: I0312 15:56:05.846764 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf77f\" (UniqueName: \"kubernetes.io/projected/45f832c1-0ece-49ee-9a70-08ec6ceac27e-kube-api-access-pf77f\") on node \"crc\" DevicePath \"\"" Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.080424 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" 
event={"ID":"45f832c1-0ece-49ee-9a70-08ec6ceac27e","Type":"ContainerDied","Data":"5b046e4315fd5b725dd5ab7aa314700c9b25ba570f3b0469316430a9ce433942"} Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.080496 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555516-pr7cb" Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.080497 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b046e4315fd5b725dd5ab7aa314700c9b25ba570f3b0469316430a9ce433942" Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.080455 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clpb4" Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.129653 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clpb4"] Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.138704 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-clpb4"] Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.146552 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555510-697s4"] Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.151703 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.153875 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555510-697s4"] Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.346786 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cfe119-de9a-4446-b328-801c994bbb61" path="/var/lib/kubelet/pods/21cfe119-de9a-4446-b328-801c994bbb61/volumes" Mar 12 15:56:06 crc kubenswrapper[4869]: I0312 15:56:06.347721 4869 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" path="/var/lib/kubelet/pods/e2dbe8cd-729f-4947-afdd-d09422953d60/volumes" Mar 12 15:56:08 crc kubenswrapper[4869]: I0312 15:56:08.230727 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q987x"] Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.108660 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q987x" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="registry-server" containerID="cri-o://cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc" gracePeriod=2 Mar 12 15:56:09 crc kubenswrapper[4869]: E0312 15:56:09.331103 4869 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bb95c2c_78f1_4541_b01a_d40615995d8c.slice/crio-cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc.scope\": RecentStats: unable to find data in memory cache]" Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.775286 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.845029 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-utilities\") pod \"8bb95c2c-78f1-4541-b01a-d40615995d8c\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.845201 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-catalog-content\") pod \"8bb95c2c-78f1-4541-b01a-d40615995d8c\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.845316 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc88h\" (UniqueName: \"kubernetes.io/projected/8bb95c2c-78f1-4541-b01a-d40615995d8c-kube-api-access-fc88h\") pod \"8bb95c2c-78f1-4541-b01a-d40615995d8c\" (UID: \"8bb95c2c-78f1-4541-b01a-d40615995d8c\") " Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.847052 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-utilities" (OuterVolumeSpecName: "utilities") pod "8bb95c2c-78f1-4541-b01a-d40615995d8c" (UID: "8bb95c2c-78f1-4541-b01a-d40615995d8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.851422 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb95c2c-78f1-4541-b01a-d40615995d8c-kube-api-access-fc88h" (OuterVolumeSpecName: "kube-api-access-fc88h") pod "8bb95c2c-78f1-4541-b01a-d40615995d8c" (UID: "8bb95c2c-78f1-4541-b01a-d40615995d8c"). InnerVolumeSpecName "kube-api-access-fc88h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.920873 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bb95c2c-78f1-4541-b01a-d40615995d8c" (UID: "8bb95c2c-78f1-4541-b01a-d40615995d8c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.947476 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.947529 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb95c2c-78f1-4541-b01a-d40615995d8c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:56:09 crc kubenswrapper[4869]: I0312 15:56:09.947562 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc88h\" (UniqueName: \"kubernetes.io/projected/8bb95c2c-78f1-4541-b01a-d40615995d8c-kube-api-access-fc88h\") on node \"crc\" DevicePath \"\"" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.128864 4869 generic.go:334] "Generic (PLEG): container finished" podID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerID="cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc" exitCode=0 Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.128908 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q987x" event={"ID":"8bb95c2c-78f1-4541-b01a-d40615995d8c","Type":"ContainerDied","Data":"cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc"} Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.128935 4869 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-q987x" event={"ID":"8bb95c2c-78f1-4541-b01a-d40615995d8c","Type":"ContainerDied","Data":"a0ec90be26e8a4cdfd1cf176105960f8283350de6d89f54abac117f49442b2dc"} Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.128953 4869 scope.go:117] "RemoveContainer" containerID="cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.129071 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q987x" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.166691 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q987x"] Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.179579 4869 scope.go:117] "RemoveContainer" containerID="a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.181518 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q987x"] Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.211168 4869 scope.go:117] "RemoveContainer" containerID="22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.257780 4869 scope.go:117] "RemoveContainer" containerID="cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc" Mar 12 15:56:10 crc kubenswrapper[4869]: E0312 15:56:10.258300 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc\": container with ID starting with cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc not found: ID does not exist" containerID="cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 
15:56:10.258346 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc"} err="failed to get container status \"cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc\": rpc error: code = NotFound desc = could not find container \"cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc\": container with ID starting with cf09bd285f1360e5a707d78ca84569e699d597f07137b095cac039587a8401fc not found: ID does not exist" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.258371 4869 scope.go:117] "RemoveContainer" containerID="a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d" Mar 12 15:56:10 crc kubenswrapper[4869]: E0312 15:56:10.258883 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d\": container with ID starting with a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d not found: ID does not exist" containerID="a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.258944 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d"} err="failed to get container status \"a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d\": rpc error: code = NotFound desc = could not find container \"a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d\": container with ID starting with a0a96ca914cff1fffffb00b804830a52a3422141e2a28851781ec3bda749f14d not found: ID does not exist" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.258983 4869 scope.go:117] "RemoveContainer" containerID="22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638" Mar 12 15:56:10 crc 
kubenswrapper[4869]: E0312 15:56:10.259329 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638\": container with ID starting with 22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638 not found: ID does not exist" containerID="22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.259357 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638"} err="failed to get container status \"22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638\": rpc error: code = NotFound desc = could not find container \"22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638\": container with ID starting with 22c26175c70e2134100c4ab2141016cd6cae025498557fd867ea69e62477b638 not found: ID does not exist" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.337063 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:56:10 crc kubenswrapper[4869]: E0312 15:56:10.337436 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:56:10 crc kubenswrapper[4869]: I0312 15:56:10.348017 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" path="/var/lib/kubelet/pods/8bb95c2c-78f1-4541-b01a-d40615995d8c/volumes" Mar 12 15:56:22 crc 
kubenswrapper[4869]: I0312 15:56:22.336189 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:56:22 crc kubenswrapper[4869]: E0312 15:56:22.336904 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:56:34 crc kubenswrapper[4869]: I0312 15:56:34.337117 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:56:34 crc kubenswrapper[4869]: E0312 15:56:34.337811 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 15:56:45 crc kubenswrapper[4869]: I0312 15:56:45.337032 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:56:45 crc kubenswrapper[4869]: E0312 15:56:45.337820 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 
12 15:56:45 crc kubenswrapper[4869]: I0312 15:56:45.857481 4869 scope.go:117] "RemoveContainer" containerID="9a61966355c148cecce6dcbab801999052bb473bac5778b3f8cdd14818f94068" Mar 12 15:57:00 crc kubenswrapper[4869]: I0312 15:57:00.336801 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 15:57:01 crc kubenswrapper[4869]: I0312 15:57:01.579685 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"4f1f716a83ed2ed2475b6a6f3d5f92c71cf571d0688c327fc80039a9d481905b"} Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.149178 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555518-9lfc8"] Mar 12 15:58:00 crc kubenswrapper[4869]: E0312 15:58:00.150115 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="extract-content" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150128 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="extract-content" Mar 12 15:58:00 crc kubenswrapper[4869]: E0312 15:58:00.150147 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="extract-utilities" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150153 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="extract-utilities" Mar 12 15:58:00 crc kubenswrapper[4869]: E0312 15:58:00.150167 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="registry-server" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150173 4869 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="registry-server" Mar 12 15:58:00 crc kubenswrapper[4869]: E0312 15:58:00.150183 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f832c1-0ece-49ee-9a70-08ec6ceac27e" containerName="oc" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150189 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f832c1-0ece-49ee-9a70-08ec6ceac27e" containerName="oc" Mar 12 15:58:00 crc kubenswrapper[4869]: E0312 15:58:00.150209 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="extract-content" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150214 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="extract-content" Mar 12 15:58:00 crc kubenswrapper[4869]: E0312 15:58:00.150224 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="registry-server" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150229 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="registry-server" Mar 12 15:58:00 crc kubenswrapper[4869]: E0312 15:58:00.150236 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="extract-utilities" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150242 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="extract-utilities" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150428 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f832c1-0ece-49ee-9a70-08ec6ceac27e" containerName="oc" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150441 4869 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8bb95c2c-78f1-4541-b01a-d40615995d8c" containerName="registry-server" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.150450 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2dbe8cd-729f-4947-afdd-d09422953d60" containerName="registry-server" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.151100 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555518-9lfc8" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.154191 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.154252 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.154479 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.162368 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555518-9lfc8"] Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.250664 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnq6\" (UniqueName: \"kubernetes.io/projected/9d9798e7-7936-421d-a9b8-f37daf8efc1e-kube-api-access-qfnq6\") pod \"auto-csr-approver-29555518-9lfc8\" (UID: \"9d9798e7-7936-421d-a9b8-f37daf8efc1e\") " pod="openshift-infra/auto-csr-approver-29555518-9lfc8" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.353621 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnq6\" (UniqueName: \"kubernetes.io/projected/9d9798e7-7936-421d-a9b8-f37daf8efc1e-kube-api-access-qfnq6\") pod \"auto-csr-approver-29555518-9lfc8\" (UID: \"9d9798e7-7936-421d-a9b8-f37daf8efc1e\") " 
pod="openshift-infra/auto-csr-approver-29555518-9lfc8" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.380526 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnq6\" (UniqueName: \"kubernetes.io/projected/9d9798e7-7936-421d-a9b8-f37daf8efc1e-kube-api-access-qfnq6\") pod \"auto-csr-approver-29555518-9lfc8\" (UID: \"9d9798e7-7936-421d-a9b8-f37daf8efc1e\") " pod="openshift-infra/auto-csr-approver-29555518-9lfc8" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.475333 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555518-9lfc8" Mar 12 15:58:00 crc kubenswrapper[4869]: I0312 15:58:00.959010 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555518-9lfc8"] Mar 12 15:58:01 crc kubenswrapper[4869]: I0312 15:58:01.175418 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555518-9lfc8" event={"ID":"9d9798e7-7936-421d-a9b8-f37daf8efc1e","Type":"ContainerStarted","Data":"46dd94105b4f2d6e22bd5acb2fccb758cd8713e643987c55c82b12cc9af1e3f6"} Mar 12 15:58:03 crc kubenswrapper[4869]: I0312 15:58:03.200011 4869 generic.go:334] "Generic (PLEG): container finished" podID="9d9798e7-7936-421d-a9b8-f37daf8efc1e" containerID="66cc10f53c26eb56090620228e3b1f9949e63f83b447992500ab95a6446c254c" exitCode=0 Mar 12 15:58:03 crc kubenswrapper[4869]: I0312 15:58:03.200076 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555518-9lfc8" event={"ID":"9d9798e7-7936-421d-a9b8-f37daf8efc1e","Type":"ContainerDied","Data":"66cc10f53c26eb56090620228e3b1f9949e63f83b447992500ab95a6446c254c"} Mar 12 15:58:04 crc kubenswrapper[4869]: I0312 15:58:04.806859 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555518-9lfc8" Mar 12 15:58:04 crc kubenswrapper[4869]: I0312 15:58:04.848431 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnq6\" (UniqueName: \"kubernetes.io/projected/9d9798e7-7936-421d-a9b8-f37daf8efc1e-kube-api-access-qfnq6\") pod \"9d9798e7-7936-421d-a9b8-f37daf8efc1e\" (UID: \"9d9798e7-7936-421d-a9b8-f37daf8efc1e\") " Mar 12 15:58:04 crc kubenswrapper[4869]: I0312 15:58:04.867896 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9798e7-7936-421d-a9b8-f37daf8efc1e-kube-api-access-qfnq6" (OuterVolumeSpecName: "kube-api-access-qfnq6") pod "9d9798e7-7936-421d-a9b8-f37daf8efc1e" (UID: "9d9798e7-7936-421d-a9b8-f37daf8efc1e"). InnerVolumeSpecName "kube-api-access-qfnq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:58:04 crc kubenswrapper[4869]: I0312 15:58:04.950992 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfnq6\" (UniqueName: \"kubernetes.io/projected/9d9798e7-7936-421d-a9b8-f37daf8efc1e-kube-api-access-qfnq6\") on node \"crc\" DevicePath \"\"" Mar 12 15:58:05 crc kubenswrapper[4869]: I0312 15:58:05.217904 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555518-9lfc8" event={"ID":"9d9798e7-7936-421d-a9b8-f37daf8efc1e","Type":"ContainerDied","Data":"46dd94105b4f2d6e22bd5acb2fccb758cd8713e643987c55c82b12cc9af1e3f6"} Mar 12 15:58:05 crc kubenswrapper[4869]: I0312 15:58:05.217944 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46dd94105b4f2d6e22bd5acb2fccb758cd8713e643987c55c82b12cc9af1e3f6" Mar 12 15:58:05 crc kubenswrapper[4869]: I0312 15:58:05.218240 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555518-9lfc8" Mar 12 15:58:05 crc kubenswrapper[4869]: I0312 15:58:05.881924 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555512-hk2tt"] Mar 12 15:58:05 crc kubenswrapper[4869]: I0312 15:58:05.892428 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555512-hk2tt"] Mar 12 15:58:06 crc kubenswrapper[4869]: I0312 15:58:06.350158 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca69e87-876f-414e-b342-32d0b7f09fe7" path="/var/lib/kubelet/pods/fca69e87-876f-414e-b342-32d0b7f09fe7/volumes" Mar 12 15:58:45 crc kubenswrapper[4869]: I0312 15:58:45.971187 4869 scope.go:117] "RemoveContainer" containerID="4fc4e2c64a4d84a66221582e5d1750188efd995bcf776304da9a1caedfaa7cb6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.153074 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bw6n6"] Mar 12 15:58:53 crc kubenswrapper[4869]: E0312 15:58:53.155255 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9798e7-7936-421d-a9b8-f37daf8efc1e" containerName="oc" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.155470 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9798e7-7936-421d-a9b8-f37daf8efc1e" containerName="oc" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.155793 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9798e7-7936-421d-a9b8-f37daf8efc1e" containerName="oc" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.157581 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.168413 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw6n6"] Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.230979 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-utilities\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.231336 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxjj\" (UniqueName: \"kubernetes.io/projected/7d726db9-c68a-4cb2-a119-f01407ce7e4e-kube-api-access-5kxjj\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.231502 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-catalog-content\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.332919 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-catalog-content\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.333050 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-utilities\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.333165 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kxjj\" (UniqueName: \"kubernetes.io/projected/7d726db9-c68a-4cb2-a119-f01407ce7e4e-kube-api-access-5kxjj\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.333520 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-catalog-content\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.333630 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-utilities\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.366507 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kxjj\" (UniqueName: \"kubernetes.io/projected/7d726db9-c68a-4cb2-a119-f01407ce7e4e-kube-api-access-5kxjj\") pod \"redhat-marketplace-bw6n6\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:53 crc kubenswrapper[4869]: I0312 15:58:53.478250 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:58:54 crc kubenswrapper[4869]: I0312 15:58:54.007388 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw6n6"] Mar 12 15:58:54 crc kubenswrapper[4869]: I0312 15:58:54.697683 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerID="b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17" exitCode=0 Mar 12 15:58:54 crc kubenswrapper[4869]: I0312 15:58:54.697737 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw6n6" event={"ID":"7d726db9-c68a-4cb2-a119-f01407ce7e4e","Type":"ContainerDied","Data":"b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17"} Mar 12 15:58:54 crc kubenswrapper[4869]: I0312 15:58:54.698069 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw6n6" event={"ID":"7d726db9-c68a-4cb2-a119-f01407ce7e4e","Type":"ContainerStarted","Data":"1b12a6cbec8e6193db1fc45c2e9d34b7536e1f9e6275bb130f59904e5a6de671"} Mar 12 15:58:55 crc kubenswrapper[4869]: I0312 15:58:55.710001 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw6n6" event={"ID":"7d726db9-c68a-4cb2-a119-f01407ce7e4e","Type":"ContainerStarted","Data":"c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77"} Mar 12 15:58:56 crc kubenswrapper[4869]: I0312 15:58:56.718412 4869 generic.go:334] "Generic (PLEG): container finished" podID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerID="c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77" exitCode=0 Mar 12 15:58:56 crc kubenswrapper[4869]: I0312 15:58:56.718459 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw6n6" 
event={"ID":"7d726db9-c68a-4cb2-a119-f01407ce7e4e","Type":"ContainerDied","Data":"c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77"} Mar 12 15:58:57 crc kubenswrapper[4869]: I0312 15:58:57.730418 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw6n6" event={"ID":"7d726db9-c68a-4cb2-a119-f01407ce7e4e","Type":"ContainerStarted","Data":"51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89"} Mar 12 15:58:57 crc kubenswrapper[4869]: I0312 15:58:57.761390 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bw6n6" podStartSLOduration=2.062918839 podStartE2EDuration="4.761358358s" podCreationTimestamp="2026-03-12 15:58:53 +0000 UTC" firstStartedPulling="2026-03-12 15:58:54.70031793 +0000 UTC m=+4286.985543248" lastFinishedPulling="2026-03-12 15:58:57.398757489 +0000 UTC m=+4289.683982767" observedRunningTime="2026-03-12 15:58:57.755692548 +0000 UTC m=+4290.040917846" watchObservedRunningTime="2026-03-12 15:58:57.761358358 +0000 UTC m=+4290.046583736" Mar 12 15:59:03 crc kubenswrapper[4869]: I0312 15:59:03.479081 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:59:03 crc kubenswrapper[4869]: I0312 15:59:03.479704 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:59:03 crc kubenswrapper[4869]: I0312 15:59:03.546189 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:59:03 crc kubenswrapper[4869]: I0312 15:59:03.829068 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:59:03 crc kubenswrapper[4869]: I0312 15:59:03.880514 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bw6n6"] Mar 12 15:59:05 crc kubenswrapper[4869]: I0312 15:59:05.815341 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bw6n6" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="registry-server" containerID="cri-o://51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89" gracePeriod=2 Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.258222 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.418600 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kxjj\" (UniqueName: \"kubernetes.io/projected/7d726db9-c68a-4cb2-a119-f01407ce7e4e-kube-api-access-5kxjj\") pod \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.418711 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-utilities\") pod \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.418756 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-catalog-content\") pod \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\" (UID: \"7d726db9-c68a-4cb2-a119-f01407ce7e4e\") " Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.419537 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-utilities" (OuterVolumeSpecName: "utilities") pod "7d726db9-c68a-4cb2-a119-f01407ce7e4e" (UID: 
"7d726db9-c68a-4cb2-a119-f01407ce7e4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.426420 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d726db9-c68a-4cb2-a119-f01407ce7e4e-kube-api-access-5kxjj" (OuterVolumeSpecName: "kube-api-access-5kxjj") pod "7d726db9-c68a-4cb2-a119-f01407ce7e4e" (UID: "7d726db9-c68a-4cb2-a119-f01407ce7e4e"). InnerVolumeSpecName "kube-api-access-5kxjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.458765 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d726db9-c68a-4cb2-a119-f01407ce7e4e" (UID: "7d726db9-c68a-4cb2-a119-f01407ce7e4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.521068 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kxjj\" (UniqueName: \"kubernetes.io/projected/7d726db9-c68a-4cb2-a119-f01407ce7e4e-kube-api-access-5kxjj\") on node \"crc\" DevicePath \"\"" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.521105 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.521118 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d726db9-c68a-4cb2-a119-f01407ce7e4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.828774 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerID="51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89" exitCode=0 Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.828843 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw6n6" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.828828 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw6n6" event={"ID":"7d726db9-c68a-4cb2-a119-f01407ce7e4e","Type":"ContainerDied","Data":"51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89"} Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.829273 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw6n6" event={"ID":"7d726db9-c68a-4cb2-a119-f01407ce7e4e","Type":"ContainerDied","Data":"1b12a6cbec8e6193db1fc45c2e9d34b7536e1f9e6275bb130f59904e5a6de671"} Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.829337 4869 scope.go:117] "RemoveContainer" containerID="51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.863081 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw6n6"] Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.863214 4869 scope.go:117] "RemoveContainer" containerID="c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.874478 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw6n6"] Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.882099 4869 scope.go:117] "RemoveContainer" containerID="b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.944180 4869 scope.go:117] "RemoveContainer" 
containerID="51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89" Mar 12 15:59:06 crc kubenswrapper[4869]: E0312 15:59:06.944631 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89\": container with ID starting with 51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89 not found: ID does not exist" containerID="51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.944689 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89"} err="failed to get container status \"51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89\": rpc error: code = NotFound desc = could not find container \"51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89\": container with ID starting with 51f5b04c25e7a1979cdaf8c1e00300c29bf9a1be8b18897399a44d094d8f9a89 not found: ID does not exist" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.944724 4869 scope.go:117] "RemoveContainer" containerID="c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77" Mar 12 15:59:06 crc kubenswrapper[4869]: E0312 15:59:06.945046 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77\": container with ID starting with c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77 not found: ID does not exist" containerID="c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.945071 4869 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77"} err="failed to get container status \"c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77\": rpc error: code = NotFound desc = could not find container \"c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77\": container with ID starting with c42bdd5cdb837a8d50e917464312b24abebb53c6e8ddd4932d2ce6285497dd77 not found: ID does not exist" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.945089 4869 scope.go:117] "RemoveContainer" containerID="b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17" Mar 12 15:59:06 crc kubenswrapper[4869]: E0312 15:59:06.945377 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17\": container with ID starting with b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17 not found: ID does not exist" containerID="b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17" Mar 12 15:59:06 crc kubenswrapper[4869]: I0312 15:59:06.945408 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17"} err="failed to get container status \"b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17\": rpc error: code = NotFound desc = could not find container \"b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17\": container with ID starting with b1bdae54ab26ac00c393c276571e97cddc4dece6cabad70f3823fcc84d604e17 not found: ID does not exist" Mar 12 15:59:08 crc kubenswrapper[4869]: I0312 15:59:08.350239 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" path="/var/lib/kubelet/pods/7d726db9-c68a-4cb2-a119-f01407ce7e4e/volumes" Mar 12 15:59:19 crc kubenswrapper[4869]: I0312 
15:59:19.685002 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:59:19 crc kubenswrapper[4869]: I0312 15:59:19.687906 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:59:49 crc kubenswrapper[4869]: I0312 15:59:49.683910 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:59:49 crc kubenswrapper[4869]: I0312 15:59:49.684712 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.155841 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz"] Mar 12 16:00:00 crc kubenswrapper[4869]: E0312 16:00:00.156906 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="registry-server" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.156923 4869 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="registry-server" Mar 12 16:00:00 crc kubenswrapper[4869]: E0312 16:00:00.156949 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="extract-utilities" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.156958 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="extract-utilities" Mar 12 16:00:00 crc kubenswrapper[4869]: E0312 16:00:00.156982 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="extract-content" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.156990 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="extract-content" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.157222 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d726db9-c68a-4cb2-a119-f01407ce7e4e" containerName="registry-server" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.157988 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.160769 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.160804 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.165909 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555520-v8kjt"] Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.168283 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555520-v8kjt" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.172023 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.172227 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.173761 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.215503 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz"] Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.224872 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555520-v8kjt"] Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.239262 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hltzd\" (UniqueName: \"kubernetes.io/projected/eb4e751e-a8db-4ffd-97e9-a0dba749a305-kube-api-access-hltzd\") pod \"auto-csr-approver-29555520-v8kjt\" (UID: \"eb4e751e-a8db-4ffd-97e9-a0dba749a305\") " pod="openshift-infra/auto-csr-approver-29555520-v8kjt" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.243956 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-config-volume\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.244312 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hp4dz\" (UniqueName: \"kubernetes.io/projected/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-kube-api-access-hp4dz\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.244502 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-secret-volume\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.347756 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp4dz\" (UniqueName: \"kubernetes.io/projected/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-kube-api-access-hp4dz\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.347853 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-secret-volume\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.347934 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hltzd\" (UniqueName: \"kubernetes.io/projected/eb4e751e-a8db-4ffd-97e9-a0dba749a305-kube-api-access-hltzd\") pod \"auto-csr-approver-29555520-v8kjt\" (UID: \"eb4e751e-a8db-4ffd-97e9-a0dba749a305\") " 
pod="openshift-infra/auto-csr-approver-29555520-v8kjt" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.347971 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-config-volume\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.350140 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-config-volume\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.355498 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-secret-volume\") pod \"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.384315 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hltzd\" (UniqueName: \"kubernetes.io/projected/eb4e751e-a8db-4ffd-97e9-a0dba749a305-kube-api-access-hltzd\") pod \"auto-csr-approver-29555520-v8kjt\" (UID: \"eb4e751e-a8db-4ffd-97e9-a0dba749a305\") " pod="openshift-infra/auto-csr-approver-29555520-v8kjt" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.385264 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp4dz\" (UniqueName: \"kubernetes.io/projected/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-kube-api-access-hp4dz\") pod 
\"collect-profiles-29555520-k4bsz\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.510089 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:00 crc kubenswrapper[4869]: I0312 16:00:00.519079 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555520-v8kjt" Mar 12 16:00:01 crc kubenswrapper[4869]: I0312 16:00:00.992213 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz"] Mar 12 16:00:01 crc kubenswrapper[4869]: I0312 16:00:01.089111 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555520-v8kjt"] Mar 12 16:00:01 crc kubenswrapper[4869]: W0312 16:00:01.090153 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb4e751e_a8db_4ffd_97e9_a0dba749a305.slice/crio-74803229155ded55976f8f58b073a5ce3658b7148b0a4e06318989bb53e08254 WatchSource:0}: Error finding container 74803229155ded55976f8f58b073a5ce3658b7148b0a4e06318989bb53e08254: Status 404 returned error can't find the container with id 74803229155ded55976f8f58b073a5ce3658b7148b0a4e06318989bb53e08254 Mar 12 16:00:01 crc kubenswrapper[4869]: I0312 16:00:01.380315 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" event={"ID":"a784a45e-9fc5-42d0-b86e-70a2c75e22d8","Type":"ContainerStarted","Data":"279763dafffd11907dd2eb6174ad0ed0c3a2eee7ccd85a62ca0ccf2a7818168a"} Mar 12 16:00:01 crc kubenswrapper[4869]: I0312 16:00:01.381905 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" event={"ID":"a784a45e-9fc5-42d0-b86e-70a2c75e22d8","Type":"ContainerStarted","Data":"70b1e6c40b4094d4b306c04a0a9546617e45daf6e3db1d58aab71b24dcf11232"} Mar 12 16:00:01 crc kubenswrapper[4869]: I0312 16:00:01.382123 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555520-v8kjt" event={"ID":"eb4e751e-a8db-4ffd-97e9-a0dba749a305","Type":"ContainerStarted","Data":"74803229155ded55976f8f58b073a5ce3658b7148b0a4e06318989bb53e08254"} Mar 12 16:00:01 crc kubenswrapper[4869]: I0312 16:00:01.404038 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" podStartSLOduration=1.4040123979999999 podStartE2EDuration="1.404012398s" podCreationTimestamp="2026-03-12 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:00:01.399171801 +0000 UTC m=+4353.684397119" watchObservedRunningTime="2026-03-12 16:00:01.404012398 +0000 UTC m=+4353.689237716" Mar 12 16:00:02 crc kubenswrapper[4869]: I0312 16:00:02.394933 4869 generic.go:334] "Generic (PLEG): container finished" podID="a784a45e-9fc5-42d0-b86e-70a2c75e22d8" containerID="279763dafffd11907dd2eb6174ad0ed0c3a2eee7ccd85a62ca0ccf2a7818168a" exitCode=0 Mar 12 16:00:02 crc kubenswrapper[4869]: I0312 16:00:02.395158 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" event={"ID":"a784a45e-9fc5-42d0-b86e-70a2c75e22d8","Type":"ContainerDied","Data":"279763dafffd11907dd2eb6174ad0ed0c3a2eee7ccd85a62ca0ccf2a7818168a"} Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.800577 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.931625 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp4dz\" (UniqueName: \"kubernetes.io/projected/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-kube-api-access-hp4dz\") pod \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.932061 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-config-volume\") pod \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.932291 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-secret-volume\") pod \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\" (UID: \"a784a45e-9fc5-42d0-b86e-70a2c75e22d8\") " Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.932600 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "a784a45e-9fc5-42d0-b86e-70a2c75e22d8" (UID: "a784a45e-9fc5-42d0-b86e-70a2c75e22d8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.933277 4869 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.937995 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a784a45e-9fc5-42d0-b86e-70a2c75e22d8" (UID: "a784a45e-9fc5-42d0-b86e-70a2c75e22d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 16:00:03 crc kubenswrapper[4869]: I0312 16:00:03.952260 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-kube-api-access-hp4dz" (OuterVolumeSpecName: "kube-api-access-hp4dz") pod "a784a45e-9fc5-42d0-b86e-70a2c75e22d8" (UID: "a784a45e-9fc5-42d0-b86e-70a2c75e22d8"). InnerVolumeSpecName "kube-api-access-hp4dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:00:04 crc kubenswrapper[4869]: I0312 16:00:04.034891 4869 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 16:00:04 crc kubenswrapper[4869]: I0312 16:00:04.034940 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp4dz\" (UniqueName: \"kubernetes.io/projected/a784a45e-9fc5-42d0-b86e-70a2c75e22d8-kube-api-access-hp4dz\") on node \"crc\" DevicePath \"\"" Mar 12 16:00:04 crc kubenswrapper[4869]: I0312 16:00:04.412452 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" event={"ID":"a784a45e-9fc5-42d0-b86e-70a2c75e22d8","Type":"ContainerDied","Data":"70b1e6c40b4094d4b306c04a0a9546617e45daf6e3db1d58aab71b24dcf11232"} Mar 12 16:00:04 crc kubenswrapper[4869]: I0312 16:00:04.412499 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70b1e6c40b4094d4b306c04a0a9546617e45daf6e3db1d58aab71b24dcf11232" Mar 12 16:00:04 crc kubenswrapper[4869]: I0312 16:00:04.412560 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555520-k4bsz" Mar 12 16:00:04 crc kubenswrapper[4869]: I0312 16:00:04.475413 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj"] Mar 12 16:00:04 crc kubenswrapper[4869]: I0312 16:00:04.484133 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-zv8vj"] Mar 12 16:00:06 crc kubenswrapper[4869]: I0312 16:00:06.352925 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f762edf1-d60b-44cf-a949-f8a8d53599c3" path="/var/lib/kubelet/pods/f762edf1-d60b-44cf-a949-f8a8d53599c3/volumes" Mar 12 16:00:19 crc kubenswrapper[4869]: I0312 16:00:19.684169 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:00:19 crc kubenswrapper[4869]: I0312 16:00:19.684993 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:00:19 crc kubenswrapper[4869]: I0312 16:00:19.685061 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 16:00:19 crc kubenswrapper[4869]: I0312 16:00:19.686044 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f1f716a83ed2ed2475b6a6f3d5f92c71cf571d0688c327fc80039a9d481905b"} 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 16:00:19 crc kubenswrapper[4869]: I0312 16:00:19.686154 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://4f1f716a83ed2ed2475b6a6f3d5f92c71cf571d0688c327fc80039a9d481905b" gracePeriod=600 Mar 12 16:00:20 crc kubenswrapper[4869]: I0312 16:00:20.576270 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="4f1f716a83ed2ed2475b6a6f3d5f92c71cf571d0688c327fc80039a9d481905b" exitCode=0 Mar 12 16:00:20 crc kubenswrapper[4869]: I0312 16:00:20.576335 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"4f1f716a83ed2ed2475b6a6f3d5f92c71cf571d0688c327fc80039a9d481905b"} Mar 12 16:00:20 crc kubenswrapper[4869]: I0312 16:00:20.576827 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e"} Mar 12 16:00:20 crc kubenswrapper[4869]: I0312 16:00:20.576849 4869 scope.go:117] "RemoveContainer" containerID="1c13a2dfc9197e0c3d6275fadcbbd981ea89dbc3fccb1c99e4066afa6307103c" Mar 12 16:00:28 crc kubenswrapper[4869]: I0312 16:00:28.652904 4869 generic.go:334] "Generic (PLEG): container finished" podID="eb4e751e-a8db-4ffd-97e9-a0dba749a305" containerID="cf11f4c01b7522bb921936f8376c24ccc4d962bfce414ecc898c2b8a038f1114" exitCode=0 Mar 12 16:00:28 crc kubenswrapper[4869]: I0312 16:00:28.652984 4869 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555520-v8kjt" event={"ID":"eb4e751e-a8db-4ffd-97e9-a0dba749a305","Type":"ContainerDied","Data":"cf11f4c01b7522bb921936f8376c24ccc4d962bfce414ecc898c2b8a038f1114"} Mar 12 16:00:30 crc kubenswrapper[4869]: I0312 16:00:30.020972 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555520-v8kjt" Mar 12 16:00:30 crc kubenswrapper[4869]: I0312 16:00:30.193278 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hltzd\" (UniqueName: \"kubernetes.io/projected/eb4e751e-a8db-4ffd-97e9-a0dba749a305-kube-api-access-hltzd\") pod \"eb4e751e-a8db-4ffd-97e9-a0dba749a305\" (UID: \"eb4e751e-a8db-4ffd-97e9-a0dba749a305\") " Mar 12 16:00:30 crc kubenswrapper[4869]: I0312 16:00:30.203398 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4e751e-a8db-4ffd-97e9-a0dba749a305-kube-api-access-hltzd" (OuterVolumeSpecName: "kube-api-access-hltzd") pod "eb4e751e-a8db-4ffd-97e9-a0dba749a305" (UID: "eb4e751e-a8db-4ffd-97e9-a0dba749a305"). InnerVolumeSpecName "kube-api-access-hltzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:00:30 crc kubenswrapper[4869]: I0312 16:00:30.297108 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hltzd\" (UniqueName: \"kubernetes.io/projected/eb4e751e-a8db-4ffd-97e9-a0dba749a305-kube-api-access-hltzd\") on node \"crc\" DevicePath \"\"" Mar 12 16:00:30 crc kubenswrapper[4869]: I0312 16:00:30.673284 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555520-v8kjt" event={"ID":"eb4e751e-a8db-4ffd-97e9-a0dba749a305","Type":"ContainerDied","Data":"74803229155ded55976f8f58b073a5ce3658b7148b0a4e06318989bb53e08254"} Mar 12 16:00:30 crc kubenswrapper[4869]: I0312 16:00:30.673333 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74803229155ded55976f8f58b073a5ce3658b7148b0a4e06318989bb53e08254" Mar 12 16:00:30 crc kubenswrapper[4869]: I0312 16:00:30.673381 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555520-v8kjt" Mar 12 16:00:31 crc kubenswrapper[4869]: I0312 16:00:31.105814 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555514-6vcqk"] Mar 12 16:00:31 crc kubenswrapper[4869]: I0312 16:00:31.117484 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555514-6vcqk"] Mar 12 16:00:32 crc kubenswrapper[4869]: I0312 16:00:32.351739 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33af9345-810d-48e3-9531-a4d445b5a6a1" path="/var/lib/kubelet/pods/33af9345-810d-48e3-9531-a4d445b5a6a1/volumes" Mar 12 16:00:46 crc kubenswrapper[4869]: I0312 16:00:46.118737 4869 scope.go:117] "RemoveContainer" containerID="20b506cbc56dc3d139f6dd738d14366b6c8d4771496087708289001b10cbbe7a" Mar 12 16:00:46 crc kubenswrapper[4869]: I0312 16:00:46.150927 4869 scope.go:117] "RemoveContainer" 
containerID="440387b5434a645f86069b4f40bcdf88e7bf4d75483193ec2bb74b40445ac8d9" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.156601 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29555521-g6jvn"] Mar 12 16:01:00 crc kubenswrapper[4869]: E0312 16:01:00.158746 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a784a45e-9fc5-42d0-b86e-70a2c75e22d8" containerName="collect-profiles" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.158782 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a784a45e-9fc5-42d0-b86e-70a2c75e22d8" containerName="collect-profiles" Mar 12 16:01:00 crc kubenswrapper[4869]: E0312 16:01:00.158862 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4e751e-a8db-4ffd-97e9-a0dba749a305" containerName="oc" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.158889 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4e751e-a8db-4ffd-97e9-a0dba749a305" containerName="oc" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.159671 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a784a45e-9fc5-42d0-b86e-70a2c75e22d8" containerName="collect-profiles" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.159744 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4e751e-a8db-4ffd-97e9-a0dba749a305" containerName="oc" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.161108 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.177360 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29555521-g6jvn"] Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.354412 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-combined-ca-bundle\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.355143 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-config-data\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.355203 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-fernet-keys\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.355980 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5rnz\" (UniqueName: \"kubernetes.io/projected/e2481b36-642b-40ec-a6b2-c5f9750997a1-kube-api-access-w5rnz\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.457555 4869 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-combined-ca-bundle\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.457661 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-config-data\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.457693 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-fernet-keys\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.457733 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5rnz\" (UniqueName: \"kubernetes.io/projected/e2481b36-642b-40ec-a6b2-c5f9750997a1-kube-api-access-w5rnz\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.678166 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-combined-ca-bundle\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.678482 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-fernet-keys\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.679033 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-config-data\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.679852 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5rnz\" (UniqueName: \"kubernetes.io/projected/e2481b36-642b-40ec-a6b2-c5f9750997a1-kube-api-access-w5rnz\") pod \"keystone-cron-29555521-g6jvn\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:00 crc kubenswrapper[4869]: I0312 16:01:00.807156 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:01 crc kubenswrapper[4869]: I0312 16:01:01.277619 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29555521-g6jvn"] Mar 12 16:01:01 crc kubenswrapper[4869]: I0312 16:01:01.977201 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555521-g6jvn" event={"ID":"e2481b36-642b-40ec-a6b2-c5f9750997a1","Type":"ContainerStarted","Data":"4be027e651adc8b19830775798bb7cf3d1d1050acbb182d83041102c80e61f31"} Mar 12 16:01:01 crc kubenswrapper[4869]: I0312 16:01:01.977694 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555521-g6jvn" event={"ID":"e2481b36-642b-40ec-a6b2-c5f9750997a1","Type":"ContainerStarted","Data":"b156a82c0feba5c083643de084e259d40112fc4fd5b2cd3869ae11ab543a1bc3"} Mar 12 16:01:01 crc kubenswrapper[4869]: I0312 16:01:01.997959 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29555521-g6jvn" podStartSLOduration=1.9979387229999999 podStartE2EDuration="1.997938723s" podCreationTimestamp="2026-03-12 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:01:01.997441759 +0000 UTC m=+4414.282667107" watchObservedRunningTime="2026-03-12 16:01:01.997938723 +0000 UTC m=+4414.283164001" Mar 12 16:01:05 crc kubenswrapper[4869]: I0312 16:01:05.004894 4869 generic.go:334] "Generic (PLEG): container finished" podID="e2481b36-642b-40ec-a6b2-c5f9750997a1" containerID="4be027e651adc8b19830775798bb7cf3d1d1050acbb182d83041102c80e61f31" exitCode=0 Mar 12 16:01:05 crc kubenswrapper[4869]: I0312 16:01:05.004973 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555521-g6jvn" 
event={"ID":"e2481b36-642b-40ec-a6b2-c5f9750997a1","Type":"ContainerDied","Data":"4be027e651adc8b19830775798bb7cf3d1d1050acbb182d83041102c80e61f31"} Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.359343 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.414011 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-fernet-keys\") pod \"e2481b36-642b-40ec-a6b2-c5f9750997a1\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.414068 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-combined-ca-bundle\") pod \"e2481b36-642b-40ec-a6b2-c5f9750997a1\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.414167 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5rnz\" (UniqueName: \"kubernetes.io/projected/e2481b36-642b-40ec-a6b2-c5f9750997a1-kube-api-access-w5rnz\") pod \"e2481b36-642b-40ec-a6b2-c5f9750997a1\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.414183 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-config-data\") pod \"e2481b36-642b-40ec-a6b2-c5f9750997a1\" (UID: \"e2481b36-642b-40ec-a6b2-c5f9750997a1\") " Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.420888 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2481b36-642b-40ec-a6b2-c5f9750997a1-kube-api-access-w5rnz" 
(OuterVolumeSpecName: "kube-api-access-w5rnz") pod "e2481b36-642b-40ec-a6b2-c5f9750997a1" (UID: "e2481b36-642b-40ec-a6b2-c5f9750997a1"). InnerVolumeSpecName "kube-api-access-w5rnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.431835 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e2481b36-642b-40ec-a6b2-c5f9750997a1" (UID: "e2481b36-642b-40ec-a6b2-c5f9750997a1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.454026 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2481b36-642b-40ec-a6b2-c5f9750997a1" (UID: "e2481b36-642b-40ec-a6b2-c5f9750997a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.494575 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-config-data" (OuterVolumeSpecName: "config-data") pod "e2481b36-642b-40ec-a6b2-c5f9750997a1" (UID: "e2481b36-642b-40ec-a6b2-c5f9750997a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.516875 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.516917 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5rnz\" (UniqueName: \"kubernetes.io/projected/e2481b36-642b-40ec-a6b2-c5f9750997a1-kube-api-access-w5rnz\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.516931 4869 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:06 crc kubenswrapper[4869]: I0312 16:01:06.516943 4869 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2481b36-642b-40ec-a6b2-c5f9750997a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:07 crc kubenswrapper[4869]: I0312 16:01:07.030026 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555521-g6jvn" event={"ID":"e2481b36-642b-40ec-a6b2-c5f9750997a1","Type":"ContainerDied","Data":"b156a82c0feba5c083643de084e259d40112fc4fd5b2cd3869ae11ab543a1bc3"} Mar 12 16:01:07 crc kubenswrapper[4869]: I0312 16:01:07.030068 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b156a82c0feba5c083643de084e259d40112fc4fd5b2cd3869ae11ab543a1bc3" Mar 12 16:01:07 crc kubenswrapper[4869]: I0312 16:01:07.030126 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555521-g6jvn" Mar 12 16:01:40 crc kubenswrapper[4869]: I0312 16:01:40.342125 4869 generic.go:334] "Generic (PLEG): container finished" podID="e6182d72-d424-4a24-bb32-1c43aaa82bba" containerID="54c239e479b7d22722ff1817b02fdab11ff5547cc1e1e1e16c01f8e58a0b213c" exitCode=0 Mar 12 16:01:40 crc kubenswrapper[4869]: I0312 16:01:40.346922 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e6182d72-d424-4a24-bb32-1c43aaa82bba","Type":"ContainerDied","Data":"54c239e479b7d22722ff1817b02fdab11ff5547cc1e1e1e16c01f8e58a0b213c"} Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.368056 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e6182d72-d424-4a24-bb32-1c43aaa82bba","Type":"ContainerDied","Data":"ce545262dd1c3a54f5126f3464c4afe21b45cc3054056538eac4411c4c477233"} Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.368904 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce545262dd1c3a54f5126f3464c4afe21b45cc3054056538eac4411c4c477233" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.599964 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.748278 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-workdir\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.748351 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.748392 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvrjr\" (UniqueName: \"kubernetes.io/projected/e6182d72-d424-4a24-bb32-1c43aaa82bba-kube-api-access-nvrjr\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.748464 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ca-certs\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.748516 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.749738 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-config-data\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.749851 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-temporary\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.749877 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config-secret\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.749910 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ssh-key\") pod \"e6182d72-d424-4a24-bb32-1c43aaa82bba\" (UID: \"e6182d72-d424-4a24-bb32-1c43aaa82bba\") " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.753696 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.758759 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-config-data" (OuterVolumeSpecName: "config-data") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.759257 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6182d72-d424-4a24-bb32-1c43aaa82bba-kube-api-access-nvrjr" (OuterVolumeSpecName: "kube-api-access-nvrjr") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "kube-api-access-nvrjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.764990 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.767369 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.784923 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.802645 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.811956 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.819241 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e6182d72-d424-4a24-bb32-1c43aaa82bba" (UID: "e6182d72-d424-4a24-bb32-1c43aaa82bba"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852337 4869 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852376 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852388 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvrjr\" (UniqueName: \"kubernetes.io/projected/e6182d72-d424-4a24-bb32-1c43aaa82bba-kube-api-access-nvrjr\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852399 4869 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852434 4869 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852443 4869 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6182d72-d424-4a24-bb32-1c43aaa82bba-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852456 4869 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e6182d72-d424-4a24-bb32-1c43aaa82bba-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 12 
16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852465 4869 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.852474 4869 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6182d72-d424-4a24-bb32-1c43aaa82bba-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.878204 4869 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 12 16:01:42 crc kubenswrapper[4869]: I0312 16:01:42.954055 4869 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 12 16:01:43 crc kubenswrapper[4869]: I0312 16:01:43.378631 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.764436 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 12 16:01:53 crc kubenswrapper[4869]: E0312 16:01:53.766093 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6182d72-d424-4a24-bb32-1c43aaa82bba" containerName="tempest-tests-tempest-tests-runner" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.766122 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6182d72-d424-4a24-bb32-1c43aaa82bba" containerName="tempest-tests-tempest-tests-runner" Mar 12 16:01:53 crc kubenswrapper[4869]: E0312 16:01:53.766142 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2481b36-642b-40ec-a6b2-c5f9750997a1" containerName="keystone-cron" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.766155 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2481b36-642b-40ec-a6b2-c5f9750997a1" containerName="keystone-cron" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.766507 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2481b36-642b-40ec-a6b2-c5f9750997a1" containerName="keystone-cron" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.766569 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6182d72-d424-4a24-bb32-1c43aaa82bba" containerName="tempest-tests-tempest-tests-runner" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.767757 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.775464 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.883666 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00cc4d19-9222-42f6-a43c-46cfd886fd77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.883894 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6lcg\" (UniqueName: \"kubernetes.io/projected/00cc4d19-9222-42f6-a43c-46cfd886fd77-kube-api-access-p6lcg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00cc4d19-9222-42f6-a43c-46cfd886fd77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.985684 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00cc4d19-9222-42f6-a43c-46cfd886fd77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.985805 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6lcg\" (UniqueName: \"kubernetes.io/projected/00cc4d19-9222-42f6-a43c-46cfd886fd77-kube-api-access-p6lcg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00cc4d19-9222-42f6-a43c-46cfd886fd77\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:53 crc kubenswrapper[4869]: I0312 16:01:53.987770 4869 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00cc4d19-9222-42f6-a43c-46cfd886fd77\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:54 crc kubenswrapper[4869]: I0312 16:01:54.012220 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6lcg\" (UniqueName: \"kubernetes.io/projected/00cc4d19-9222-42f6-a43c-46cfd886fd77-kube-api-access-p6lcg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00cc4d19-9222-42f6-a43c-46cfd886fd77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:54 crc kubenswrapper[4869]: I0312 16:01:54.014731 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"00cc4d19-9222-42f6-a43c-46cfd886fd77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:54 crc kubenswrapper[4869]: I0312 16:01:54.101511 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 16:01:54 crc kubenswrapper[4869]: I0312 16:01:54.557046 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 12 16:01:54 crc kubenswrapper[4869]: I0312 16:01:54.887113 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 16:01:55 crc kubenswrapper[4869]: I0312 16:01:55.493815 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"00cc4d19-9222-42f6-a43c-46cfd886fd77","Type":"ContainerStarted","Data":"7fe9e84637db179a05ea69ffdc91d27be0f0fa5e9f4a1257e3619500ea4b4462"} Mar 12 16:01:56 crc kubenswrapper[4869]: I0312 16:01:56.507257 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"00cc4d19-9222-42f6-a43c-46cfd886fd77","Type":"ContainerStarted","Data":"d2f2faa6584a444fc3dae2123ca68a43ddb951e3e945700a3c3c5af3292807c7"} Mar 12 16:01:56 crc kubenswrapper[4869]: I0312 16:01:56.530910 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.415802585 podStartE2EDuration="3.530886798s" podCreationTimestamp="2026-03-12 16:01:53 +0000 UTC" firstStartedPulling="2026-03-12 16:01:54.886836651 +0000 UTC m=+4467.172061939" lastFinishedPulling="2026-03-12 16:01:56.001920874 +0000 UTC m=+4468.287146152" observedRunningTime="2026-03-12 16:01:56.529938392 +0000 UTC m=+4468.815163680" watchObservedRunningTime="2026-03-12 16:01:56.530886798 +0000 UTC m=+4468.816112096" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.150524 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555522-7bd8f"] Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 
16:02:00.152759 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555522-7bd8f" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.155364 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.155679 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.155902 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.160251 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555522-7bd8f"] Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.310888 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc57j\" (UniqueName: \"kubernetes.io/projected/e1ac2137-2cbd-424a-8668-cd87d1fe0334-kube-api-access-qc57j\") pod \"auto-csr-approver-29555522-7bd8f\" (UID: \"e1ac2137-2cbd-424a-8668-cd87d1fe0334\") " pod="openshift-infra/auto-csr-approver-29555522-7bd8f" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.413033 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc57j\" (UniqueName: \"kubernetes.io/projected/e1ac2137-2cbd-424a-8668-cd87d1fe0334-kube-api-access-qc57j\") pod \"auto-csr-approver-29555522-7bd8f\" (UID: \"e1ac2137-2cbd-424a-8668-cd87d1fe0334\") " pod="openshift-infra/auto-csr-approver-29555522-7bd8f" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.435396 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc57j\" (UniqueName: \"kubernetes.io/projected/e1ac2137-2cbd-424a-8668-cd87d1fe0334-kube-api-access-qc57j\") pod 
\"auto-csr-approver-29555522-7bd8f\" (UID: \"e1ac2137-2cbd-424a-8668-cd87d1fe0334\") " pod="openshift-infra/auto-csr-approver-29555522-7bd8f" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.472826 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555522-7bd8f" Mar 12 16:02:00 crc kubenswrapper[4869]: I0312 16:02:00.888265 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555522-7bd8f"] Mar 12 16:02:01 crc kubenswrapper[4869]: I0312 16:02:01.564877 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555522-7bd8f" event={"ID":"e1ac2137-2cbd-424a-8668-cd87d1fe0334","Type":"ContainerStarted","Data":"35920b920e2c0442b903236499076896e7eb84665f0ded83d27e11c53b57e99b"} Mar 12 16:02:03 crc kubenswrapper[4869]: I0312 16:02:03.595243 4869 generic.go:334] "Generic (PLEG): container finished" podID="e1ac2137-2cbd-424a-8668-cd87d1fe0334" containerID="1ee983b7440ac0ee3b9ea83249f52f0c23cd0382c8a0271c47052f270e30e2f0" exitCode=0 Mar 12 16:02:03 crc kubenswrapper[4869]: I0312 16:02:03.595362 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555522-7bd8f" event={"ID":"e1ac2137-2cbd-424a-8668-cd87d1fe0334","Type":"ContainerDied","Data":"1ee983b7440ac0ee3b9ea83249f52f0c23cd0382c8a0271c47052f270e30e2f0"} Mar 12 16:02:04 crc kubenswrapper[4869]: I0312 16:02:04.936133 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555522-7bd8f" Mar 12 16:02:05 crc kubenswrapper[4869]: I0312 16:02:05.006157 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc57j\" (UniqueName: \"kubernetes.io/projected/e1ac2137-2cbd-424a-8668-cd87d1fe0334-kube-api-access-qc57j\") pod \"e1ac2137-2cbd-424a-8668-cd87d1fe0334\" (UID: \"e1ac2137-2cbd-424a-8668-cd87d1fe0334\") " Mar 12 16:02:05 crc kubenswrapper[4869]: I0312 16:02:05.011403 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ac2137-2cbd-424a-8668-cd87d1fe0334-kube-api-access-qc57j" (OuterVolumeSpecName: "kube-api-access-qc57j") pod "e1ac2137-2cbd-424a-8668-cd87d1fe0334" (UID: "e1ac2137-2cbd-424a-8668-cd87d1fe0334"). InnerVolumeSpecName "kube-api-access-qc57j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:02:05 crc kubenswrapper[4869]: I0312 16:02:05.108613 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc57j\" (UniqueName: \"kubernetes.io/projected/e1ac2137-2cbd-424a-8668-cd87d1fe0334-kube-api-access-qc57j\") on node \"crc\" DevicePath \"\"" Mar 12 16:02:05 crc kubenswrapper[4869]: I0312 16:02:05.615060 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555522-7bd8f" event={"ID":"e1ac2137-2cbd-424a-8668-cd87d1fe0334","Type":"ContainerDied","Data":"35920b920e2c0442b903236499076896e7eb84665f0ded83d27e11c53b57e99b"} Mar 12 16:02:05 crc kubenswrapper[4869]: I0312 16:02:05.615131 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35920b920e2c0442b903236499076896e7eb84665f0ded83d27e11c53b57e99b" Mar 12 16:02:05 crc kubenswrapper[4869]: I0312 16:02:05.615224 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555522-7bd8f" Mar 12 16:02:06 crc kubenswrapper[4869]: I0312 16:02:05.999903 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555516-pr7cb"] Mar 12 16:02:06 crc kubenswrapper[4869]: I0312 16:02:06.008027 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555516-pr7cb"] Mar 12 16:02:06 crc kubenswrapper[4869]: I0312 16:02:06.346479 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f832c1-0ece-49ee-9a70-08ec6ceac27e" path="/var/lib/kubelet/pods/45f832c1-0ece-49ee-9a70-08ec6ceac27e/volumes" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.128285 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-296v9"] Mar 12 16:02:11 crc kubenswrapper[4869]: E0312 16:02:11.131104 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ac2137-2cbd-424a-8668-cd87d1fe0334" containerName="oc" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.131136 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ac2137-2cbd-424a-8668-cd87d1fe0334" containerName="oc" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.131434 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ac2137-2cbd-424a-8668-cd87d1fe0334" containerName="oc" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.133639 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.140557 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-296v9"] Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.185470 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hct6l\" (UniqueName: \"kubernetes.io/projected/17f3aa03-66bd-4f6e-810e-b23bec9894b2-kube-api-access-hct6l\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.185513 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-catalog-content\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.185834 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-utilities\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.287354 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hct6l\" (UniqueName: \"kubernetes.io/projected/17f3aa03-66bd-4f6e-810e-b23bec9894b2-kube-api-access-hct6l\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.287397 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-catalog-content\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.287444 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-utilities\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.288101 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-catalog-content\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.288179 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-utilities\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.313618 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hct6l\" (UniqueName: \"kubernetes.io/projected/17f3aa03-66bd-4f6e-810e-b23bec9894b2-kube-api-access-hct6l\") pod \"redhat-operators-296v9\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:11 crc kubenswrapper[4869]: I0312 16:02:11.469294 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:12 crc kubenswrapper[4869]: I0312 16:02:12.004499 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-296v9"] Mar 12 16:02:12 crc kubenswrapper[4869]: I0312 16:02:12.689909 4869 generic.go:334] "Generic (PLEG): container finished" podID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerID="0334c3bf049384281bce87f68c624426996b49452c20e97b279faea47df2484d" exitCode=0 Mar 12 16:02:12 crc kubenswrapper[4869]: I0312 16:02:12.690372 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-296v9" event={"ID":"17f3aa03-66bd-4f6e-810e-b23bec9894b2","Type":"ContainerDied","Data":"0334c3bf049384281bce87f68c624426996b49452c20e97b279faea47df2484d"} Mar 12 16:02:12 crc kubenswrapper[4869]: I0312 16:02:12.690405 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-296v9" event={"ID":"17f3aa03-66bd-4f6e-810e-b23bec9894b2","Type":"ContainerStarted","Data":"93c7b83785a018ec235db06d05b40372472168e11910e35b806877118779ff62"} Mar 12 16:02:14 crc kubenswrapper[4869]: I0312 16:02:14.718590 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-296v9" event={"ID":"17f3aa03-66bd-4f6e-810e-b23bec9894b2","Type":"ContainerStarted","Data":"c0b55a67b86962bdc7c2bb83bd43734b5ab790ee2401440f7cb62c3059af398d"} Mar 12 16:02:18 crc kubenswrapper[4869]: I0312 16:02:18.761023 4869 generic.go:334] "Generic (PLEG): container finished" podID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerID="c0b55a67b86962bdc7c2bb83bd43734b5ab790ee2401440f7cb62c3059af398d" exitCode=0 Mar 12 16:02:18 crc kubenswrapper[4869]: I0312 16:02:18.761554 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-296v9" 
event={"ID":"17f3aa03-66bd-4f6e-810e-b23bec9894b2","Type":"ContainerDied","Data":"c0b55a67b86962bdc7c2bb83bd43734b5ab790ee2401440f7cb62c3059af398d"} Mar 12 16:02:19 crc kubenswrapper[4869]: I0312 16:02:19.777177 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-296v9" event={"ID":"17f3aa03-66bd-4f6e-810e-b23bec9894b2","Type":"ContainerStarted","Data":"f9359631b2e402fe083e902deda1fa733cffec214543e1722956a9aba850cc70"} Mar 12 16:02:19 crc kubenswrapper[4869]: I0312 16:02:19.806355 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-296v9" podStartSLOduration=2.314580425 podStartE2EDuration="8.806290206s" podCreationTimestamp="2026-03-12 16:02:11 +0000 UTC" firstStartedPulling="2026-03-12 16:02:12.694100605 +0000 UTC m=+4484.979325883" lastFinishedPulling="2026-03-12 16:02:19.185810386 +0000 UTC m=+4491.471035664" observedRunningTime="2026-03-12 16:02:19.799370331 +0000 UTC m=+4492.084595639" watchObservedRunningTime="2026-03-12 16:02:19.806290206 +0000 UTC m=+4492.091515524" Mar 12 16:02:21 crc kubenswrapper[4869]: I0312 16:02:21.469747 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:21 crc kubenswrapper[4869]: I0312 16:02:21.470527 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:22 crc kubenswrapper[4869]: I0312 16:02:22.516957 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-296v9" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="registry-server" probeResult="failure" output=< Mar 12 16:02:22 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 16:02:22 crc kubenswrapper[4869]: > Mar 12 16:02:31 crc kubenswrapper[4869]: I0312 16:02:31.560663 4869 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:31 crc kubenswrapper[4869]: I0312 16:02:31.631315 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:31 crc kubenswrapper[4869]: I0312 16:02:31.824048 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-296v9"] Mar 12 16:02:32 crc kubenswrapper[4869]: I0312 16:02:32.910422 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-296v9" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="registry-server" containerID="cri-o://f9359631b2e402fe083e902deda1fa733cffec214543e1722956a9aba850cc70" gracePeriod=2 Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.104071 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5hx4/must-gather-5zgtd"] Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.106519 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.108366 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w5hx4"/"kube-root-ca.crt" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.108677 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w5hx4"/"openshift-service-ca.crt" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.109373 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-w5hx4"/"default-dockercfg-hzsdh" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.175136 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w5hx4/must-gather-5zgtd"] Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.194779 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmspj\" (UniqueName: \"kubernetes.io/projected/93aa2341-4454-4086-8a4e-c63f6d317bbf-kube-api-access-qmspj\") pod \"must-gather-5zgtd\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.194852 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93aa2341-4454-4086-8a4e-c63f6d317bbf-must-gather-output\") pod \"must-gather-5zgtd\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.296704 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmspj\" (UniqueName: \"kubernetes.io/projected/93aa2341-4454-4086-8a4e-c63f6d317bbf-kube-api-access-qmspj\") pod \"must-gather-5zgtd\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " 
pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.296769 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93aa2341-4454-4086-8a4e-c63f6d317bbf-must-gather-output\") pod \"must-gather-5zgtd\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.297670 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93aa2341-4454-4086-8a4e-c63f6d317bbf-must-gather-output\") pod \"must-gather-5zgtd\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.338563 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmspj\" (UniqueName: \"kubernetes.io/projected/93aa2341-4454-4086-8a4e-c63f6d317bbf-kube-api-access-qmspj\") pod \"must-gather-5zgtd\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.462051 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.960773 4869 generic.go:334] "Generic (PLEG): container finished" podID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerID="f9359631b2e402fe083e902deda1fa733cffec214543e1722956a9aba850cc70" exitCode=0 Mar 12 16:02:33 crc kubenswrapper[4869]: I0312 16:02:33.961144 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-296v9" event={"ID":"17f3aa03-66bd-4f6e-810e-b23bec9894b2","Type":"ContainerDied","Data":"f9359631b2e402fe083e902deda1fa733cffec214543e1722956a9aba850cc70"} Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.008717 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w5hx4/must-gather-5zgtd"] Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.073856 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.154609 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-utilities\") pod \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.154664 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hct6l\" (UniqueName: \"kubernetes.io/projected/17f3aa03-66bd-4f6e-810e-b23bec9894b2-kube-api-access-hct6l\") pod \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.154843 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-catalog-content\") pod 
\"17f3aa03-66bd-4f6e-810e-b23bec9894b2\" (UID: \"17f3aa03-66bd-4f6e-810e-b23bec9894b2\") " Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.155688 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-utilities" (OuterVolumeSpecName: "utilities") pod "17f3aa03-66bd-4f6e-810e-b23bec9894b2" (UID: "17f3aa03-66bd-4f6e-810e-b23bec9894b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.160671 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f3aa03-66bd-4f6e-810e-b23bec9894b2-kube-api-access-hct6l" (OuterVolumeSpecName: "kube-api-access-hct6l") pod "17f3aa03-66bd-4f6e-810e-b23bec9894b2" (UID: "17f3aa03-66bd-4f6e-810e-b23bec9894b2"). InnerVolumeSpecName "kube-api-access-hct6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.257514 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.257575 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hct6l\" (UniqueName: \"kubernetes.io/projected/17f3aa03-66bd-4f6e-810e-b23bec9894b2-kube-api-access-hct6l\") on node \"crc\" DevicePath \"\"" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.304508 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17f3aa03-66bd-4f6e-810e-b23bec9894b2" (UID: "17f3aa03-66bd-4f6e-810e-b23bec9894b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.366235 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f3aa03-66bd-4f6e-810e-b23bec9894b2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.977522 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" event={"ID":"93aa2341-4454-4086-8a4e-c63f6d317bbf","Type":"ContainerStarted","Data":"5e9860ea71c213bff5c64dee3b8128d415e3c13d0e2f550ce31b2d96df14b242"} Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.990908 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-296v9" event={"ID":"17f3aa03-66bd-4f6e-810e-b23bec9894b2","Type":"ContainerDied","Data":"93c7b83785a018ec235db06d05b40372472168e11910e35b806877118779ff62"} Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.992146 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-296v9" Mar 12 16:02:34 crc kubenswrapper[4869]: I0312 16:02:34.992666 4869 scope.go:117] "RemoveContainer" containerID="f9359631b2e402fe083e902deda1fa733cffec214543e1722956a9aba850cc70" Mar 12 16:02:35 crc kubenswrapper[4869]: I0312 16:02:35.046769 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-296v9"] Mar 12 16:02:35 crc kubenswrapper[4869]: I0312 16:02:35.061943 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-296v9"] Mar 12 16:02:35 crc kubenswrapper[4869]: I0312 16:02:35.089737 4869 scope.go:117] "RemoveContainer" containerID="c0b55a67b86962bdc7c2bb83bd43734b5ab790ee2401440f7cb62c3059af398d" Mar 12 16:02:35 crc kubenswrapper[4869]: I0312 16:02:35.160726 4869 scope.go:117] "RemoveContainer" containerID="0334c3bf049384281bce87f68c624426996b49452c20e97b279faea47df2484d" Mar 12 16:02:36 crc kubenswrapper[4869]: I0312 16:02:36.350655 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" path="/var/lib/kubelet/pods/17f3aa03-66bd-4f6e-810e-b23bec9894b2/volumes" Mar 12 16:02:41 crc kubenswrapper[4869]: I0312 16:02:41.070925 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" event={"ID":"93aa2341-4454-4086-8a4e-c63f6d317bbf","Type":"ContainerStarted","Data":"784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7"} Mar 12 16:02:41 crc kubenswrapper[4869]: I0312 16:02:41.071735 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" event={"ID":"93aa2341-4454-4086-8a4e-c63f6d317bbf","Type":"ContainerStarted","Data":"3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba"} Mar 12 16:02:41 crc kubenswrapper[4869]: I0312 16:02:41.103704 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-w5hx4/must-gather-5zgtd" podStartSLOduration=1.902793567 podStartE2EDuration="8.103664092s" podCreationTimestamp="2026-03-12 16:02:33 +0000 UTC" firstStartedPulling="2026-03-12 16:02:34.016612971 +0000 UTC m=+4506.301838259" lastFinishedPulling="2026-03-12 16:02:40.217483506 +0000 UTC m=+4512.502708784" observedRunningTime="2026-03-12 16:02:41.089591794 +0000 UTC m=+4513.374817072" watchObservedRunningTime="2026-03-12 16:02:41.103664092 +0000 UTC m=+4513.388889390" Mar 12 16:02:46 crc kubenswrapper[4869]: I0312 16:02:46.314080 4869 scope.go:117] "RemoveContainer" containerID="d8d16c83826a008d768f530722c1e4b2dd706756e102e6e7b19811950c83e927" Mar 12 16:02:46 crc kubenswrapper[4869]: I0312 16:02:46.505240 4869 scope.go:117] "RemoveContainer" containerID="290a75b3566b355e4311f4dc90c395bbdb068b422e77ad31602427c8e47786f1" Mar 12 16:02:46 crc kubenswrapper[4869]: I0312 16:02:46.543160 4869 scope.go:117] "RemoveContainer" containerID="d180263f821b245f288ddc63ed63e0a03401b9204639d61097b20ac644620e85" Mar 12 16:02:46 crc kubenswrapper[4869]: I0312 16:02:46.641560 4869 scope.go:117] "RemoveContainer" containerID="7dba95faaa5ee2e852fa2a03dd76ddc77d80bafebe1befb813732fc5faaf93fe" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.081849 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-d8jfc"] Mar 12 16:02:47 crc kubenswrapper[4869]: E0312 16:02:47.082612 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="extract-utilities" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.082637 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="extract-utilities" Mar 12 16:02:47 crc kubenswrapper[4869]: E0312 16:02:47.082674 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="extract-content" Mar 12 
16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.082683 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="extract-content" Mar 12 16:02:47 crc kubenswrapper[4869]: E0312 16:02:47.082702 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="registry-server" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.082709 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="registry-server" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.082920 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f3aa03-66bd-4f6e-810e-b23bec9894b2" containerName="registry-server" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.083755 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.248094 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cebdea4-5359-424d-9166-2b1a242a8197-host\") pod \"crc-debug-d8jfc\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.248158 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8vgb\" (UniqueName: \"kubernetes.io/projected/7cebdea4-5359-424d-9166-2b1a242a8197-kube-api-access-t8vgb\") pod \"crc-debug-d8jfc\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.349647 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7cebdea4-5359-424d-9166-2b1a242a8197-host\") pod \"crc-debug-d8jfc\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.349713 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8vgb\" (UniqueName: \"kubernetes.io/projected/7cebdea4-5359-424d-9166-2b1a242a8197-kube-api-access-t8vgb\") pod \"crc-debug-d8jfc\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.349767 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cebdea4-5359-424d-9166-2b1a242a8197-host\") pod \"crc-debug-d8jfc\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.376792 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8vgb\" (UniqueName: \"kubernetes.io/projected/7cebdea4-5359-424d-9166-2b1a242a8197-kube-api-access-t8vgb\") pod \"crc-debug-d8jfc\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: I0312 16:02:47.403534 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:02:47 crc kubenswrapper[4869]: W0312 16:02:47.431984 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cebdea4_5359_424d_9166_2b1a242a8197.slice/crio-e99acaa40ace62f3e1f267c3a6f12023993e775ee8f3c7f704a8700524344594 WatchSource:0}: Error finding container e99acaa40ace62f3e1f267c3a6f12023993e775ee8f3c7f704a8700524344594: Status 404 returned error can't find the container with id e99acaa40ace62f3e1f267c3a6f12023993e775ee8f3c7f704a8700524344594 Mar 12 16:02:48 crc kubenswrapper[4869]: I0312 16:02:48.135453 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" event={"ID":"7cebdea4-5359-424d-9166-2b1a242a8197","Type":"ContainerStarted","Data":"e99acaa40ace62f3e1f267c3a6f12023993e775ee8f3c7f704a8700524344594"} Mar 12 16:02:49 crc kubenswrapper[4869]: I0312 16:02:49.684695 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:02:49 crc kubenswrapper[4869]: I0312 16:02:49.685288 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:02:59 crc kubenswrapper[4869]: I0312 16:02:59.256031 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" 
event={"ID":"7cebdea4-5359-424d-9166-2b1a242a8197","Type":"ContainerStarted","Data":"1048a45ac39c4b4e0c329b04cbef94cb49c55efdde97fbf64c3cd2d9a94ab2ea"} Mar 12 16:02:59 crc kubenswrapper[4869]: I0312 16:02:59.280532 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" podStartSLOduration=1.304741184 podStartE2EDuration="12.280515818s" podCreationTimestamp="2026-03-12 16:02:47 +0000 UTC" firstStartedPulling="2026-03-12 16:02:47.434161319 +0000 UTC m=+4519.719386597" lastFinishedPulling="2026-03-12 16:02:58.409935953 +0000 UTC m=+4530.695161231" observedRunningTime="2026-03-12 16:02:59.274260401 +0000 UTC m=+4531.559485679" watchObservedRunningTime="2026-03-12 16:02:59.280515818 +0000 UTC m=+4531.565741096" Mar 12 16:03:19 crc kubenswrapper[4869]: I0312 16:03:19.684586 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:03:19 crc kubenswrapper[4869]: I0312 16:03:19.686493 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:03:44 crc kubenswrapper[4869]: I0312 16:03:44.690154 4869 generic.go:334] "Generic (PLEG): container finished" podID="7cebdea4-5359-424d-9166-2b1a242a8197" containerID="1048a45ac39c4b4e0c329b04cbef94cb49c55efdde97fbf64c3cd2d9a94ab2ea" exitCode=0 Mar 12 16:03:44 crc kubenswrapper[4869]: I0312 16:03:44.690210 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" 
event={"ID":"7cebdea4-5359-424d-9166-2b1a242a8197","Type":"ContainerDied","Data":"1048a45ac39c4b4e0c329b04cbef94cb49c55efdde97fbf64c3cd2d9a94ab2ea"} Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.811096 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.846228 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-d8jfc"] Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.855764 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-d8jfc"] Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.948877 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cebdea4-5359-424d-9166-2b1a242a8197-host\") pod \"7cebdea4-5359-424d-9166-2b1a242a8197\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.949148 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cebdea4-5359-424d-9166-2b1a242a8197-host" (OuterVolumeSpecName: "host") pod "7cebdea4-5359-424d-9166-2b1a242a8197" (UID: "7cebdea4-5359-424d-9166-2b1a242a8197"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.949194 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8vgb\" (UniqueName: \"kubernetes.io/projected/7cebdea4-5359-424d-9166-2b1a242a8197-kube-api-access-t8vgb\") pod \"7cebdea4-5359-424d-9166-2b1a242a8197\" (UID: \"7cebdea4-5359-424d-9166-2b1a242a8197\") " Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.949600 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cebdea4-5359-424d-9166-2b1a242a8197-host\") on node \"crc\" DevicePath \"\"" Mar 12 16:03:45 crc kubenswrapper[4869]: I0312 16:03:45.970890 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cebdea4-5359-424d-9166-2b1a242a8197-kube-api-access-t8vgb" (OuterVolumeSpecName: "kube-api-access-t8vgb") pod "7cebdea4-5359-424d-9166-2b1a242a8197" (UID: "7cebdea4-5359-424d-9166-2b1a242a8197"). InnerVolumeSpecName "kube-api-access-t8vgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:03:46 crc kubenswrapper[4869]: I0312 16:03:46.051439 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8vgb\" (UniqueName: \"kubernetes.io/projected/7cebdea4-5359-424d-9166-2b1a242a8197-kube-api-access-t8vgb\") on node \"crc\" DevicePath \"\"" Mar 12 16:03:46 crc kubenswrapper[4869]: I0312 16:03:46.350188 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cebdea4-5359-424d-9166-2b1a242a8197" path="/var/lib/kubelet/pods/7cebdea4-5359-424d-9166-2b1a242a8197/volumes" Mar 12 16:03:46 crc kubenswrapper[4869]: I0312 16:03:46.707142 4869 scope.go:117] "RemoveContainer" containerID="1048a45ac39c4b4e0c329b04cbef94cb49c55efdde97fbf64c3cd2d9a94ab2ea" Mar 12 16:03:46 crc kubenswrapper[4869]: I0312 16:03:46.707198 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-d8jfc" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.020219 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-mmc2w"] Mar 12 16:03:47 crc kubenswrapper[4869]: E0312 16:03:47.020702 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cebdea4-5359-424d-9166-2b1a242a8197" containerName="container-00" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.020720 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cebdea4-5359-424d-9166-2b1a242a8197" containerName="container-00" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.020945 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cebdea4-5359-424d-9166-2b1a242a8197" containerName="container-00" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.021675 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.155689 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43cee24d-4704-47ea-ab51-83c03262b024-host\") pod \"crc-debug-mmc2w\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.155928 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6tq7\" (UniqueName: \"kubernetes.io/projected/43cee24d-4704-47ea-ab51-83c03262b024-kube-api-access-d6tq7\") pod \"crc-debug-mmc2w\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.257840 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/43cee24d-4704-47ea-ab51-83c03262b024-host\") pod \"crc-debug-mmc2w\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.257915 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6tq7\" (UniqueName: \"kubernetes.io/projected/43cee24d-4704-47ea-ab51-83c03262b024-kube-api-access-d6tq7\") pod \"crc-debug-mmc2w\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.257996 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43cee24d-4704-47ea-ab51-83c03262b024-host\") pod \"crc-debug-mmc2w\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.279151 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6tq7\" (UniqueName: \"kubernetes.io/projected/43cee24d-4704-47ea-ab51-83c03262b024-kube-api-access-d6tq7\") pod \"crc-debug-mmc2w\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.344143 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.716644 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" event={"ID":"43cee24d-4704-47ea-ab51-83c03262b024","Type":"ContainerStarted","Data":"7362b043ce370bd87e3a8423f1c770621cc0bbd6e0a50480a98cd68af8f553d3"} Mar 12 16:03:47 crc kubenswrapper[4869]: I0312 16:03:47.717159 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" event={"ID":"43cee24d-4704-47ea-ab51-83c03262b024","Type":"ContainerStarted","Data":"72e3bdfc3795f036aa31feaf08ebc2bf4a47658d76ad79685f72f10fc3574b8e"} Mar 12 16:03:48 crc kubenswrapper[4869]: I0312 16:03:48.181867 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" podStartSLOduration=1.181847931 podStartE2EDuration="1.181847931s" podCreationTimestamp="2026-03-12 16:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:03:48.178520677 +0000 UTC m=+4580.463745955" watchObservedRunningTime="2026-03-12 16:03:48.181847931 +0000 UTC m=+4580.467073209" Mar 12 16:03:49 crc kubenswrapper[4869]: I0312 16:03:49.683917 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:03:49 crc kubenswrapper[4869]: I0312 16:03:49.684430 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 12 16:03:49 crc kubenswrapper[4869]: I0312 16:03:49.684490 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 16:03:49 crc kubenswrapper[4869]: I0312 16:03:49.685458 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 16:03:49 crc kubenswrapper[4869]: I0312 16:03:49.685517 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" gracePeriod=600 Mar 12 16:03:49 crc kubenswrapper[4869]: E0312 16:03:49.814959 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:03:50 crc kubenswrapper[4869]: I0312 16:03:50.653328 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" exitCode=0 Mar 12 16:03:50 crc kubenswrapper[4869]: I0312 16:03:50.653420 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e"} Mar 12 16:03:50 crc kubenswrapper[4869]: I0312 16:03:50.653762 4869 scope.go:117] "RemoveContainer" containerID="4f1f716a83ed2ed2475b6a6f3d5f92c71cf571d0688c327fc80039a9d481905b" Mar 12 16:03:50 crc kubenswrapper[4869]: I0312 16:03:50.654858 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:03:50 crc kubenswrapper[4869]: E0312 16:03:50.655243 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:03:50 crc kubenswrapper[4869]: I0312 16:03:50.655808 4869 generic.go:334] "Generic (PLEG): container finished" podID="43cee24d-4704-47ea-ab51-83c03262b024" containerID="7362b043ce370bd87e3a8423f1c770621cc0bbd6e0a50480a98cd68af8f553d3" exitCode=0 Mar 12 16:03:50 crc kubenswrapper[4869]: I0312 16:03:50.655851 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" event={"ID":"43cee24d-4704-47ea-ab51-83c03262b024","Type":"ContainerDied","Data":"7362b043ce370bd87e3a8423f1c770621cc0bbd6e0a50480a98cd68af8f553d3"} Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.771646 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.846593 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-mmc2w"] Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.855392 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-mmc2w"] Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.898111 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43cee24d-4704-47ea-ab51-83c03262b024-host\") pod \"43cee24d-4704-47ea-ab51-83c03262b024\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.898228 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6tq7\" (UniqueName: \"kubernetes.io/projected/43cee24d-4704-47ea-ab51-83c03262b024-kube-api-access-d6tq7\") pod \"43cee24d-4704-47ea-ab51-83c03262b024\" (UID: \"43cee24d-4704-47ea-ab51-83c03262b024\") " Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.898271 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43cee24d-4704-47ea-ab51-83c03262b024-host" (OuterVolumeSpecName: "host") pod "43cee24d-4704-47ea-ab51-83c03262b024" (UID: "43cee24d-4704-47ea-ab51-83c03262b024"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.898843 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43cee24d-4704-47ea-ab51-83c03262b024-host\") on node \"crc\" DevicePath \"\"" Mar 12 16:03:51 crc kubenswrapper[4869]: I0312 16:03:51.903886 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43cee24d-4704-47ea-ab51-83c03262b024-kube-api-access-d6tq7" (OuterVolumeSpecName: "kube-api-access-d6tq7") pod "43cee24d-4704-47ea-ab51-83c03262b024" (UID: "43cee24d-4704-47ea-ab51-83c03262b024"). InnerVolumeSpecName "kube-api-access-d6tq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:03:52 crc kubenswrapper[4869]: I0312 16:03:52.002450 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6tq7\" (UniqueName: \"kubernetes.io/projected/43cee24d-4704-47ea-ab51-83c03262b024-kube-api-access-d6tq7\") on node \"crc\" DevicePath \"\"" Mar 12 16:03:52 crc kubenswrapper[4869]: I0312 16:03:52.355118 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43cee24d-4704-47ea-ab51-83c03262b024" path="/var/lib/kubelet/pods/43cee24d-4704-47ea-ab51-83c03262b024/volumes" Mar 12 16:03:52 crc kubenswrapper[4869]: I0312 16:03:52.699848 4869 scope.go:117] "RemoveContainer" containerID="7362b043ce370bd87e3a8423f1c770621cc0bbd6e0a50480a98cd68af8f553d3" Mar 12 16:03:52 crc kubenswrapper[4869]: I0312 16:03:52.699879 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-mmc2w" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.005666 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-99pnb"] Mar 12 16:03:53 crc kubenswrapper[4869]: E0312 16:03:53.006169 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43cee24d-4704-47ea-ab51-83c03262b024" containerName="container-00" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.006185 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="43cee24d-4704-47ea-ab51-83c03262b024" containerName="container-00" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.006385 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="43cee24d-4704-47ea-ab51-83c03262b024" containerName="container-00" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.006978 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.123278 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/512db845-c28e-4cc2-a81b-1f114688c0c4-host\") pod \"crc-debug-99pnb\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.123474 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnhs\" (UniqueName: \"kubernetes.io/projected/512db845-c28e-4cc2-a81b-1f114688c0c4-kube-api-access-plnhs\") pod \"crc-debug-99pnb\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.225469 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnhs\" (UniqueName: 
\"kubernetes.io/projected/512db845-c28e-4cc2-a81b-1f114688c0c4-kube-api-access-plnhs\") pod \"crc-debug-99pnb\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.225609 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/512db845-c28e-4cc2-a81b-1f114688c0c4-host\") pod \"crc-debug-99pnb\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.225697 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/512db845-c28e-4cc2-a81b-1f114688c0c4-host\") pod \"crc-debug-99pnb\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.246271 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnhs\" (UniqueName: \"kubernetes.io/projected/512db845-c28e-4cc2-a81b-1f114688c0c4-kube-api-access-plnhs\") pod \"crc-debug-99pnb\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:53 crc kubenswrapper[4869]: I0312 16:03:53.325741 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:54 crc kubenswrapper[4869]: I0312 16:03:54.724508 4869 generic.go:334] "Generic (PLEG): container finished" podID="512db845-c28e-4cc2-a81b-1f114688c0c4" containerID="2f51ff44fcec4aa84ab3cd982efe12bcc330d4b32dd9cce359629325e04bf140" exitCode=0 Mar 12 16:03:54 crc kubenswrapper[4869]: I0312 16:03:54.724639 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-99pnb" event={"ID":"512db845-c28e-4cc2-a81b-1f114688c0c4","Type":"ContainerDied","Data":"2f51ff44fcec4aa84ab3cd982efe12bcc330d4b32dd9cce359629325e04bf140"} Mar 12 16:03:54 crc kubenswrapper[4869]: I0312 16:03:54.724946 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/crc-debug-99pnb" event={"ID":"512db845-c28e-4cc2-a81b-1f114688c0c4","Type":"ContainerStarted","Data":"58d1ffe1e76b7fb4dd0894854afd412053c7050d69f8717e5dd29df162bff0b4"} Mar 12 16:03:54 crc kubenswrapper[4869]: I0312 16:03:54.802517 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-99pnb"] Mar 12 16:03:54 crc kubenswrapper[4869]: I0312 16:03:54.821204 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5hx4/crc-debug-99pnb"] Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.488243 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.597740 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plnhs\" (UniqueName: \"kubernetes.io/projected/512db845-c28e-4cc2-a81b-1f114688c0c4-kube-api-access-plnhs\") pod \"512db845-c28e-4cc2-a81b-1f114688c0c4\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.597841 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/512db845-c28e-4cc2-a81b-1f114688c0c4-host\") pod \"512db845-c28e-4cc2-a81b-1f114688c0c4\" (UID: \"512db845-c28e-4cc2-a81b-1f114688c0c4\") " Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.597964 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/512db845-c28e-4cc2-a81b-1f114688c0c4-host" (OuterVolumeSpecName: "host") pod "512db845-c28e-4cc2-a81b-1f114688c0c4" (UID: "512db845-c28e-4cc2-a81b-1f114688c0c4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.598409 4869 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/512db845-c28e-4cc2-a81b-1f114688c0c4-host\") on node \"crc\" DevicePath \"\"" Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.612161 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512db845-c28e-4cc2-a81b-1f114688c0c4-kube-api-access-plnhs" (OuterVolumeSpecName: "kube-api-access-plnhs") pod "512db845-c28e-4cc2-a81b-1f114688c0c4" (UID: "512db845-c28e-4cc2-a81b-1f114688c0c4"). InnerVolumeSpecName "kube-api-access-plnhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.700176 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plnhs\" (UniqueName: \"kubernetes.io/projected/512db845-c28e-4cc2-a81b-1f114688c0c4-kube-api-access-plnhs\") on node \"crc\" DevicePath \"\"" Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.750828 4869 scope.go:117] "RemoveContainer" containerID="2f51ff44fcec4aa84ab3cd982efe12bcc330d4b32dd9cce359629325e04bf140" Mar 12 16:03:56 crc kubenswrapper[4869]: I0312 16:03:56.750976 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5hx4/crc-debug-99pnb" Mar 12 16:03:58 crc kubenswrapper[4869]: I0312 16:03:58.347186 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512db845-c28e-4cc2-a81b-1f114688c0c4" path="/var/lib/kubelet/pods/512db845-c28e-4cc2-a81b-1f114688c0c4/volumes" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.142978 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555524-5nkw9"] Mar 12 16:04:00 crc kubenswrapper[4869]: E0312 16:04:00.143410 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512db845-c28e-4cc2-a81b-1f114688c0c4" containerName="container-00" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.143423 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="512db845-c28e-4cc2-a81b-1f114688c0c4" containerName="container-00" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.143634 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="512db845-c28e-4cc2-a81b-1f114688c0c4" containerName="container-00" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.144240 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555524-5nkw9" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.148873 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.148878 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.148878 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.155196 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555524-5nkw9"] Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.268641 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccr79\" (UniqueName: \"kubernetes.io/projected/a972d17c-7397-40c9-a6d4-610df87a932a-kube-api-access-ccr79\") pod \"auto-csr-approver-29555524-5nkw9\" (UID: \"a972d17c-7397-40c9-a6d4-610df87a932a\") " pod="openshift-infra/auto-csr-approver-29555524-5nkw9" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.369866 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccr79\" (UniqueName: \"kubernetes.io/projected/a972d17c-7397-40c9-a6d4-610df87a932a-kube-api-access-ccr79\") pod \"auto-csr-approver-29555524-5nkw9\" (UID: \"a972d17c-7397-40c9-a6d4-610df87a932a\") " pod="openshift-infra/auto-csr-approver-29555524-5nkw9" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.389615 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccr79\" (UniqueName: \"kubernetes.io/projected/a972d17c-7397-40c9-a6d4-610df87a932a-kube-api-access-ccr79\") pod \"auto-csr-approver-29555524-5nkw9\" (UID: \"a972d17c-7397-40c9-a6d4-610df87a932a\") " 
pod="openshift-infra/auto-csr-approver-29555524-5nkw9" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.462683 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555524-5nkw9" Mar 12 16:04:00 crc kubenswrapper[4869]: I0312 16:04:00.953711 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555524-5nkw9"] Mar 12 16:04:01 crc kubenswrapper[4869]: I0312 16:04:01.336926 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:04:01 crc kubenswrapper[4869]: E0312 16:04:01.337221 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:04:01 crc kubenswrapper[4869]: I0312 16:04:01.794181 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555524-5nkw9" event={"ID":"a972d17c-7397-40c9-a6d4-610df87a932a","Type":"ContainerStarted","Data":"cf1e1a3c1ef049f6b9b40bcd4d21937678e18af6d1cc34c8c3259a4ad33bbe04"} Mar 12 16:04:02 crc kubenswrapper[4869]: I0312 16:04:02.823301 4869 generic.go:334] "Generic (PLEG): container finished" podID="a972d17c-7397-40c9-a6d4-610df87a932a" containerID="ccf1c0d1e8b1fef8c02e8b26f9a9bfc0e993ad3b12bb78034f0921cee699e056" exitCode=0 Mar 12 16:04:02 crc kubenswrapper[4869]: I0312 16:04:02.823578 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555524-5nkw9" event={"ID":"a972d17c-7397-40c9-a6d4-610df87a932a","Type":"ContainerDied","Data":"ccf1c0d1e8b1fef8c02e8b26f9a9bfc0e993ad3b12bb78034f0921cee699e056"} 
Mar 12 16:04:04 crc kubenswrapper[4869]: I0312 16:04:04.286284 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555524-5nkw9" Mar 12 16:04:04 crc kubenswrapper[4869]: I0312 16:04:04.352608 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccr79\" (UniqueName: \"kubernetes.io/projected/a972d17c-7397-40c9-a6d4-610df87a932a-kube-api-access-ccr79\") pod \"a972d17c-7397-40c9-a6d4-610df87a932a\" (UID: \"a972d17c-7397-40c9-a6d4-610df87a932a\") " Mar 12 16:04:04 crc kubenswrapper[4869]: I0312 16:04:04.362029 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a972d17c-7397-40c9-a6d4-610df87a932a-kube-api-access-ccr79" (OuterVolumeSpecName: "kube-api-access-ccr79") pod "a972d17c-7397-40c9-a6d4-610df87a932a" (UID: "a972d17c-7397-40c9-a6d4-610df87a932a"). InnerVolumeSpecName "kube-api-access-ccr79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:04:04 crc kubenswrapper[4869]: I0312 16:04:04.458363 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccr79\" (UniqueName: \"kubernetes.io/projected/a972d17c-7397-40c9-a6d4-610df87a932a-kube-api-access-ccr79\") on node \"crc\" DevicePath \"\"" Mar 12 16:04:04 crc kubenswrapper[4869]: I0312 16:04:04.841587 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555524-5nkw9" event={"ID":"a972d17c-7397-40c9-a6d4-610df87a932a","Type":"ContainerDied","Data":"cf1e1a3c1ef049f6b9b40bcd4d21937678e18af6d1cc34c8c3259a4ad33bbe04"} Mar 12 16:04:04 crc kubenswrapper[4869]: I0312 16:04:04.841639 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf1e1a3c1ef049f6b9b40bcd4d21937678e18af6d1cc34c8c3259a4ad33bbe04" Mar 12 16:04:04 crc kubenswrapper[4869]: I0312 16:04:04.841697 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555524-5nkw9" Mar 12 16:04:05 crc kubenswrapper[4869]: I0312 16:04:05.354517 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555518-9lfc8"] Mar 12 16:04:05 crc kubenswrapper[4869]: I0312 16:04:05.363344 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555518-9lfc8"] Mar 12 16:04:06 crc kubenswrapper[4869]: I0312 16:04:06.347333 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9798e7-7936-421d-a9b8-f37daf8efc1e" path="/var/lib/kubelet/pods/9d9798e7-7936-421d-a9b8-f37daf8efc1e/volumes" Mar 12 16:04:11 crc kubenswrapper[4869]: I0312 16:04:11.891708 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-566d64c64b-vx76m_573219b1-c8f0-49ab-86ac-d2861f55dfae/barbican-api/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.047523 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-566d64c64b-vx76m_573219b1-c8f0-49ab-86ac-d2861f55dfae/barbican-api-log/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.171825 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b57d7b9cd-fglxg_41e8dfe2-49a6-48a2-a037-d090aa545010/barbican-keystone-listener/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.334187 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fd5649b5f-5rjlk_6d671489-0a10-4779-8567-c34e80544dbb/barbican-worker/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.421482 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fd5649b5f-5rjlk_6d671489-0a10-4779-8567-c34e80544dbb/barbican-worker-log/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.661031 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-wj29q_300f0918-acb3-42ed-a67a-560608d31eda/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.868238 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_699fa772-3cdc-41d3-840e-ced246a8090f/ceilometer-central-agent/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.887882 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b57d7b9cd-fglxg_41e8dfe2-49a6-48a2-a037-d090aa545010/barbican-keystone-listener-log/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.940678 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_699fa772-3cdc-41d3-840e-ced246a8090f/proxy-httpd/0.log" Mar 12 16:04:12 crc kubenswrapper[4869]: I0312 16:04:12.946236 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_699fa772-3cdc-41d3-840e-ced246a8090f/ceilometer-notification-agent/0.log" Mar 12 16:04:13 crc kubenswrapper[4869]: I0312 16:04:13.110049 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_699fa772-3cdc-41d3-840e-ced246a8090f/sg-core/0.log" Mar 12 16:04:13 crc kubenswrapper[4869]: I0312 16:04:13.242900 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_84a3366d-25a7-430b-839c-f7f21cbac99a/ceph/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.145185 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d172b0b-59f1-408e-befd-28542b61af1b/cinder-api/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.148946 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d172b0b-59f1-408e-befd-28542b61af1b/cinder-api-log/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.207529 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_449d619a-d94f-45bc-9066-919a85d84f76/probe/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.420911 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_95fcb56b-b075-469f-9a02-bcc737a18c26/cinder-scheduler/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.486595 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_95fcb56b-b075-469f-9a02-bcc737a18c26/probe/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.702958 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_449d619a-d94f-45bc-9066-919a85d84f76/cinder-backup/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.732886 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_870974b7-5400-44ae-91be-ca7bc532764c/probe/0.log" Mar 12 16:04:14 crc kubenswrapper[4869]: I0312 16:04:14.987637 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-lfqq6_7d0293d4-0a27-4535-8a7e-53a6b1c7a835/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:15 crc kubenswrapper[4869]: I0312 16:04:15.667364 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9lxgf_ed6cecf2-9a8d-4bd2-8f07-29410d12dba2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:15 crc kubenswrapper[4869]: I0312 16:04:15.735772 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-bblrn_052cf777-9cf7-44bf-bbc7-5cc89c1d5e22/init/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.149440 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-bblrn_052cf777-9cf7-44bf-bbc7-5cc89c1d5e22/init/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 
16:04:16.337370 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:04:16 crc kubenswrapper[4869]: E0312 16:04:16.340018 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.372146 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-57nnj_26bf39c9-9d2b-4d94-a4e8-c70409bb1f4f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.433843 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_442f7481-9051-4bc5-b3e5-13eef1e6ff01/glance-httpd/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.500367 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-bblrn_052cf777-9cf7-44bf-bbc7-5cc89c1d5e22/dnsmasq-dns/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.559408 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_442f7481-9051-4bc5-b3e5-13eef1e6ff01/glance-log/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.716360 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_471f34e2-b42f-468a-bd32-a1fa9612dbdc/glance-httpd/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.750654 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_471f34e2-b42f-468a-bd32-a1fa9612dbdc/glance-log/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.921297 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776cdff46d-hvjw9_f1a34267-2bcd-4a01-b2b6-7528c474a7a2/horizon/0.log" Mar 12 16:04:16 crc kubenswrapper[4869]: I0312 16:04:16.947224 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_870974b7-5400-44ae-91be-ca7bc532764c/cinder-volume/0.log" Mar 12 16:04:17 crc kubenswrapper[4869]: I0312 16:04:17.083656 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8sc5g_29b19fb6-d0a2-4944-972b-0dc4d285fd67/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:17 crc kubenswrapper[4869]: I0312 16:04:17.159522 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-776cdff46d-hvjw9_f1a34267-2bcd-4a01-b2b6-7528c474a7a2/horizon-log/0.log" Mar 12 16:04:17 crc kubenswrapper[4869]: I0312 16:04:17.362898 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xlnsg_cc29ed56-a386-475e-aece-7d5d20a479dd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:17 crc kubenswrapper[4869]: I0312 16:04:17.532507 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555521-g6jvn_e2481b36-642b-40ec-a6b2-c5f9750997a1/keystone-cron/0.log" Mar 12 16:04:17 crc kubenswrapper[4869]: I0312 16:04:17.697519 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_39aee064-bbb0-45fa-93b9-8c720fdca852/kube-state-metrics/0.log" Mar 12 16:04:17 crc kubenswrapper[4869]: I0312 16:04:17.868359 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ncsnw_2fe7a0f9-519a-4cbc-aacd-2c14989e8d7a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:18 crc kubenswrapper[4869]: I0312 16:04:18.374356 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_b0b57477-4e74-4eb5-a0a0-c11e022f6919/probe/0.log" Mar 12 16:04:18 crc kubenswrapper[4869]: I0312 16:04:18.405793 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b0c3a0a4-5746-41fe-a7a5-0386aedb2a86/manila-api/0.log" Mar 12 16:04:18 crc kubenswrapper[4869]: I0312 16:04:18.493477 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_b0b57477-4e74-4eb5-a0a0-c11e022f6919/manila-scheduler/0.log" Mar 12 16:04:18 crc kubenswrapper[4869]: I0312 16:04:18.765519 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_9523917e-a416-4304-9479-9ef0a1a9a09d/probe/0.log" Mar 12 16:04:18 crc kubenswrapper[4869]: I0312 16:04:18.981323 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_9523917e-a416-4304-9479-9ef0a1a9a09d/manila-share/0.log" Mar 12 16:04:19 crc kubenswrapper[4869]: I0312 16:04:19.018951 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b0c3a0a4-5746-41fe-a7a5-0386aedb2a86/manila-api-log/0.log" Mar 12 16:04:19 crc kubenswrapper[4869]: I0312 16:04:19.625255 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-b8cts_42c001af-bc5e-4906-b65f-0ec328893bce/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:19 crc kubenswrapper[4869]: I0312 16:04:19.864941 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79647c85bf-8lt5x_b4e6ce48-0823-4cd4-8b05-252a9ecbe205/neutron-httpd/0.log" Mar 12 16:04:20 crc kubenswrapper[4869]: I0312 
16:04:20.431126 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-79647c85bf-8lt5x_b4e6ce48-0823-4cd4-8b05-252a9ecbe205/neutron-api/0.log" Mar 12 16:04:21 crc kubenswrapper[4869]: I0312 16:04:21.373329 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ba5ec75b-57f3-43a2-b6eb-4f876d368fae/nova-cell0-conductor-conductor/0.log" Mar 12 16:04:21 crc kubenswrapper[4869]: I0312 16:04:21.697234 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-ccddd4fb7-22sz6_791764fa-f933-49c0-82ee-0a716cce2f01/keystone-api/0.log" Mar 12 16:04:21 crc kubenswrapper[4869]: I0312 16:04:21.952742 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_cda652da-aabd-4fd6-91f3-0da6ec497845/nova-cell1-conductor-conductor/0.log" Mar 12 16:04:22 crc kubenswrapper[4869]: I0312 16:04:22.252118 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a10d0ea8-db1d-4779-9b4c-d97edb20c85e/nova-cell1-novncproxy-novncproxy/0.log" Mar 12 16:04:22 crc kubenswrapper[4869]: I0312 16:04:22.435236 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ded10d99-1db4-470e-bc5f-356cf78424c4/nova-api-log/0.log" Mar 12 16:04:22 crc kubenswrapper[4869]: I0312 16:04:22.444714 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-zssnd_8510f1f3-9ca7-40ce-acb1-4ad3ab0e7ae4/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:22 crc kubenswrapper[4869]: I0312 16:04:22.758478 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2defbef6-f414-43b6-b36e-6d60027baa77/nova-metadata-log/0.log" Mar 12 16:04:22 crc kubenswrapper[4869]: I0312 16:04:22.931063 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ded10d99-1db4-470e-bc5f-356cf78424c4/nova-api-api/0.log" Mar 12 
16:04:23 crc kubenswrapper[4869]: I0312 16:04:23.157631 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8b58aad-8641-4aec-8053-f4b75d5931e8/mysql-bootstrap/0.log" Mar 12 16:04:23 crc kubenswrapper[4869]: I0312 16:04:23.324950 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8b58aad-8641-4aec-8053-f4b75d5931e8/mysql-bootstrap/0.log" Mar 12 16:04:23 crc kubenswrapper[4869]: I0312 16:04:23.380920 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8b58aad-8641-4aec-8053-f4b75d5931e8/galera/0.log" Mar 12 16:04:23 crc kubenswrapper[4869]: I0312 16:04:23.435616 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f8a25239-e1b9-4ae9-a044-8362e70d6959/nova-scheduler-scheduler/0.log" Mar 12 16:04:24 crc kubenswrapper[4869]: I0312 16:04:24.579106 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4406efc2-cefd-4e44-a5f0-7384101c9b36/mysql-bootstrap/0.log" Mar 12 16:04:24 crc kubenswrapper[4869]: I0312 16:04:24.585392 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4406efc2-cefd-4e44-a5f0-7384101c9b36/mysql-bootstrap/0.log" Mar 12 16:04:24 crc kubenswrapper[4869]: I0312 16:04:24.630702 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4406efc2-cefd-4e44-a5f0-7384101c9b36/galera/0.log" Mar 12 16:04:24 crc kubenswrapper[4869]: I0312 16:04:24.818001 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_df9ac527-ae76-4cb7-b474-61f5699e610f/openstackclient/0.log" Mar 12 16:04:24 crc kubenswrapper[4869]: I0312 16:04:24.957631 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2defbef6-f414-43b6-b36e-6d60027baa77/nova-metadata-metadata/0.log" Mar 12 16:04:25 crc kubenswrapper[4869]: I0312 
16:04:25.116520 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hrwrb_6c89cdd2-7ac2-46d2-be58-77f8b79acd81/openstack-network-exporter/0.log" Mar 12 16:04:25 crc kubenswrapper[4869]: I0312 16:04:25.240193 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jvmhv_5b0ed011-5259-4904-82e5-320adf5ff1cf/ovsdb-server-init/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.017374 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jvmhv_5b0ed011-5259-4904-82e5-320adf5ff1cf/ovsdb-server/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.036125 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jvmhv_5b0ed011-5259-4904-82e5-320adf5ff1cf/ovsdb-server-init/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.099047 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jvmhv_5b0ed011-5259-4904-82e5-320adf5ff1cf/ovs-vswitchd/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.245243 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wzzxq_ad973222-a042-43af-9c00-b0f6d795c7d1/ovn-controller/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.363267 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-d9ljv_f9846b6b-2f29-45b7-86c0-aff84144a93a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.491010 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_94c54bed-21a1-4a68-8cde-f45c89a05e85/ovn-northd/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.492697 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_94c54bed-21a1-4a68-8cde-f45c89a05e85/openstack-network-exporter/0.log" Mar 12 
16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.585971 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5bf47c8a-507e-4eba-9776-b516e4555df4/openstack-network-exporter/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.700523 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5bf47c8a-507e-4eba-9776-b516e4555df4/ovsdbserver-nb/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.820771 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0454dc15-f63d-475c-9640-71a8d60d9e56/openstack-network-exporter/0.log" Mar 12 16:04:26 crc kubenswrapper[4869]: I0312 16:04:26.888472 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0454dc15-f63d-475c-9640-71a8d60d9e56/ovsdbserver-sb/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.239073 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_be06ec61-ee02-4ab3-8e42-d98f94af4a87/setup-container/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.336608 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:04:27 crc kubenswrapper[4869]: E0312 16:04:27.336864 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.375593 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54df54bd-p64lp_efae3fc6-7dc2-4ad5-86af-ca7e623670c8/placement-api/0.log" Mar 12 16:04:27 
crc kubenswrapper[4869]: I0312 16:04:27.407865 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_be06ec61-ee02-4ab3-8e42-d98f94af4a87/setup-container/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.450607 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_be06ec61-ee02-4ab3-8e42-d98f94af4a87/rabbitmq/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.518185 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54df54bd-p64lp_efae3fc6-7dc2-4ad5-86af-ca7e623670c8/placement-log/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.587284 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f1182ade-04e6-4329-b41f-eda38443a859/setup-container/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.819227 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f1182ade-04e6-4329-b41f-eda38443a859/setup-container/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.851440 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f1182ade-04e6-4329-b41f-eda38443a859/rabbitmq/0.log" Mar 12 16:04:27 crc kubenswrapper[4869]: I0312 16:04:27.877904 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-fpxkt_90ac4b9f-1907-430a-9264-dcab4c86e118/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.050992 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-x6ckl_230d1ac1-539b-4bb6-9560-ce764540f933/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.123839 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-nwdmq_e27517ee-4f44-4cfe-9b05-5cd9dd21165a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.280378 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gg2w9_21afe139-09ed-4ebc-b81b-1be277a8e199/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.377746 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9hsm8_e9d28b57-1b1d-46a0-8a03-3ff1cdcfdcd3/ssh-known-hosts-edpm-deployment/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.640429 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5bcd88f7f5-258jj_944bd0f4-e7ea-430d-995e-fabdf1f79bab/proxy-server/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.736110 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5bcd88f7f5-258jj_944bd0f4-e7ea-430d-995e-fabdf1f79bab/proxy-httpd/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.743265 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mbvj6_db6dbe81-f329-4b21-be85-3ac61ed4c428/swift-ring-rebalance/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.909107 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/account-auditor/0.log" Mar 12 16:04:28 crc kubenswrapper[4869]: I0312 16:04:28.995651 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/account-replicator/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.016000 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/account-reaper/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.148940 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/container-auditor/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.331649 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/account-server/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.436197 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/container-replicator/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.443308 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/container-server/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.580497 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/container-updater/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.594765 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/object-auditor/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.675156 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/object-replicator/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.683774 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/object-expirer/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.813732 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/object-server/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.859947 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/object-updater/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.931243 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/rsync/0.log" Mar 12 16:04:29 crc kubenswrapper[4869]: I0312 16:04:29.977720 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c2260d9c-2497-44bb-9952-341844cf85d0/swift-recon-cron/0.log" Mar 12 16:04:30 crc kubenswrapper[4869]: I0312 16:04:30.155248 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wmss4_2b744985-65a8-49f0-b177-85c19eeb86c1/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:30 crc kubenswrapper[4869]: I0312 16:04:30.335228 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e6182d72-d424-4a24-bb32-1c43aaa82bba/tempest-tests-tempest-tests-runner/0.log" Mar 12 16:04:30 crc kubenswrapper[4869]: I0312 16:04:30.379210 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_00cc4d19-9222-42f6-a43c-46cfd886fd77/test-operator-logs-container/0.log" Mar 12 16:04:30 crc kubenswrapper[4869]: I0312 16:04:30.530557 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-2794b_69ed9651-a1cb-4470-8629-adeb0dea6377/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 16:04:33 crc kubenswrapper[4869]: I0312 16:04:33.764899 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_28b0b79e-10ab-436b-a34b-af51bd63d60a/memcached/0.log" Mar 12 16:04:42 crc kubenswrapper[4869]: I0312 16:04:42.336997 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:04:42 crc kubenswrapper[4869]: E0312 16:04:42.338221 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:04:46 crc kubenswrapper[4869]: I0312 16:04:46.813681 4869 scope.go:117] "RemoveContainer" containerID="66cc10f53c26eb56090620228e3b1f9949e63f83b447992500ab95a6446c254c" Mar 12 16:04:57 crc kubenswrapper[4869]: I0312 16:04:57.336393 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:04:57 crc kubenswrapper[4869]: E0312 16:04:57.337186 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:04:58 crc kubenswrapper[4869]: I0312 16:04:58.506307 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-2phvg_d1f4e20f-8a64-4692-aa9b-73e8eb2aeb26/manager/0.log" Mar 12 16:04:58 crc kubenswrapper[4869]: I0312 16:04:58.737459 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57_7dadcf56-7151-4b50-953e-f469f19ac9be/util/0.log" Mar 12 16:04:58 crc kubenswrapper[4869]: I0312 16:04:58.982513 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57_7dadcf56-7151-4b50-953e-f469f19ac9be/pull/0.log" Mar 12 16:04:59 crc kubenswrapper[4869]: I0312 16:04:59.018316 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57_7dadcf56-7151-4b50-953e-f469f19ac9be/util/0.log" Mar 12 16:04:59 crc kubenswrapper[4869]: I0312 16:04:59.275350 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57_7dadcf56-7151-4b50-953e-f469f19ac9be/pull/0.log" Mar 12 16:04:59 crc kubenswrapper[4869]: I0312 16:04:59.364819 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57_7dadcf56-7151-4b50-953e-f469f19ac9be/util/0.log" Mar 12 16:04:59 crc kubenswrapper[4869]: I0312 16:04:59.666209 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57_7dadcf56-7151-4b50-953e-f469f19ac9be/pull/0.log" Mar 12 16:04:59 crc kubenswrapper[4869]: I0312 16:04:59.743802 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dfd0a5aa88ceb988578091b7e436c5632868c2113fb1281941f0396cd6lnl57_7dadcf56-7151-4b50-953e-f469f19ac9be/extract/0.log" Mar 12 16:05:00 crc kubenswrapper[4869]: I0312 16:05:00.079928 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-bbfwc_8d41de12-9bea-4bbc-a276-296376e563a8/manager/0.log" Mar 12 16:05:00 crc 
kubenswrapper[4869]: I0312 16:05:00.276385 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-qhsnc_454883b6-ef08-4828-acc8-237632cf4a35/manager/0.log" Mar 12 16:05:00 crc kubenswrapper[4869]: I0312 16:05:00.606257 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-hxljg_9c4c52f4-d899-4059-8d91-29e4dd1971fd/manager/0.log" Mar 12 16:05:00 crc kubenswrapper[4869]: I0312 16:05:00.863801 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-hn64k_acaa3149-c349-4d8e-95ba-56d1714eb3b6/manager/0.log" Mar 12 16:05:00 crc kubenswrapper[4869]: I0312 16:05:00.996991 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-cxtpp_bdcee336-08a4-4504-8eb3-b09b4899d2ed/manager/0.log" Mar 12 16:05:01 crc kubenswrapper[4869]: I0312 16:05:01.209186 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-vshpv_c9089d1a-4d28-4973-a826-c7fa8b99acab/manager/0.log" Mar 12 16:05:01 crc kubenswrapper[4869]: I0312 16:05:01.371155 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-bgssp_9c8724e6-72a2-441e-bb6a-330ee1ccfe6f/manager/0.log" Mar 12 16:05:01 crc kubenswrapper[4869]: I0312 16:05:01.487006 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-sggnn_50bd86b6-6404-44de-822c-b75e6692a36b/manager/0.log" Mar 12 16:05:02 crc kubenswrapper[4869]: I0312 16:05:02.116052 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-zsvnq_2195e51b-747e-4e94-b616-8fe940ffe5ed/manager/0.log" Mar 12 
16:05:02 crc kubenswrapper[4869]: I0312 16:05:02.249173 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-h9gxw_fe9827df-43c1-4491-923e-0c745e025aec/manager/0.log" Mar 12 16:05:02 crc kubenswrapper[4869]: I0312 16:05:02.553882 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-7p294_cbf420e9-2c19-4582-b36b-0d4651f6d067/manager/0.log" Mar 12 16:05:02 crc kubenswrapper[4869]: I0312 16:05:02.653964 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-f9gh7_4532e8a5-d461-4d46-99b9-6da31edb678b/manager/0.log" Mar 12 16:05:02 crc kubenswrapper[4869]: I0312 16:05:02.757229 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7jrkfz_6e8cbe5c-f85f-4c0a-a2e1-6aa0ef3ca715/manager/0.log" Mar 12 16:05:03 crc kubenswrapper[4869]: I0312 16:05:03.091291 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-666b5bf768-cfwmm_fe21499b-24de-4455-86e6-41ca9441e3d4/operator/0.log" Mar 12 16:05:03 crc kubenswrapper[4869]: I0312 16:05:03.178082 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-89bmr_7fa1dc1f-0f9a-4a6d-b3f8-62df1e5e6a23/registry-server/0.log" Mar 12 16:05:03 crc kubenswrapper[4869]: I0312 16:05:03.331657 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-hthfl_e07ee970-a07b-4a6b-b5b5-3387bd4b6da2/manager/0.log" Mar 12 16:05:03 crc kubenswrapper[4869]: I0312 16:05:03.961818 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-t2cd8_87c7aae2-dd1d-40f2-a54c-9239fc1998de/manager/0.log" 
Mar 12 16:05:04 crc kubenswrapper[4869]: I0312 16:05:04.162025 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-zlbth_93492e48-5fe6-4e20-8597-738a93b6412c/operator/0.log" Mar 12 16:05:04 crc kubenswrapper[4869]: I0312 16:05:04.249219 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-fh776_98c64520-2dc6-4f39-a102-58e0205f7d46/manager/0.log" Mar 12 16:05:04 crc kubenswrapper[4869]: I0312 16:05:04.690495 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-mhw7h_3760f848-91f6-4fdf-9bd4-d6ffbac2de6d/manager/0.log" Mar 12 16:05:04 crc kubenswrapper[4869]: I0312 16:05:04.774911 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d46bf84bd-r6xt7_2fd71c8f-f3bf-416b-9a7b-fd108b10853d/manager/0.log" Mar 12 16:05:04 crc kubenswrapper[4869]: I0312 16:05:04.785218 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-96d7k_da890c60-e9bb-49a9-97cc-67696823d7d8/manager/0.log" Mar 12 16:05:04 crc kubenswrapper[4869]: I0312 16:05:04.899667 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-s5ptd_eff4102a-7465-4994-b6d5-1982a6ec713b/manager/0.log" Mar 12 16:05:08 crc kubenswrapper[4869]: I0312 16:05:08.382073 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-g9sqx_62ce9e5f-8d12-425b-b966-aca955bd96d9/manager/0.log" Mar 12 16:05:12 crc kubenswrapper[4869]: I0312 16:05:12.336464 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:05:12 crc kubenswrapper[4869]: E0312 
16:05:12.337286 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:05:23 crc kubenswrapper[4869]: I0312 16:05:23.336231 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:05:23 crc kubenswrapper[4869]: E0312 16:05:23.337041 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:05:27 crc kubenswrapper[4869]: I0312 16:05:27.171722 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xq5fz_68451e19-64a1-471e-85c8-7238bb88e14c/control-plane-machine-set-operator/0.log" Mar 12 16:05:27 crc kubenswrapper[4869]: I0312 16:05:27.418495 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2drd7_90ae9f42-560b-4b79-a947-25c6de331025/kube-rbac-proxy/0.log" Mar 12 16:05:27 crc kubenswrapper[4869]: I0312 16:05:27.465432 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2drd7_90ae9f42-560b-4b79-a947-25c6de331025/machine-api-operator/0.log" Mar 12 16:05:38 crc kubenswrapper[4869]: I0312 16:05:38.365689 4869 scope.go:117] "RemoveContainer" 
containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:05:38 crc kubenswrapper[4869]: E0312 16:05:38.367137 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:05:41 crc kubenswrapper[4869]: I0312 16:05:41.143373 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-9869l_1a9edcd9-b1dd-44a3-8d46-8a8e0ae4fcf5/cert-manager-controller/0.log" Mar 12 16:05:41 crc kubenswrapper[4869]: I0312 16:05:41.301579 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-rxw2v_06128598-c475-46aa-8109-12eea3f15bfb/cert-manager-cainjector/0.log" Mar 12 16:05:41 crc kubenswrapper[4869]: I0312 16:05:41.364505 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-bgzmw_ab036352-d0df-4dea-bd2d-27fde11414c7/cert-manager-webhook/0.log" Mar 12 16:05:49 crc kubenswrapper[4869]: I0312 16:05:49.337316 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:05:49 crc kubenswrapper[4869]: E0312 16:05:49.339806 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:05:53 crc 
kubenswrapper[4869]: I0312 16:05:53.591997 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-xljxl_685a6d03-a0dd-4818-82f4-1839b0c094c3/nmstate-console-plugin/0.log" Mar 12 16:05:53 crc kubenswrapper[4869]: I0312 16:05:53.625813 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b74gm_71ed27a7-141a-44c4-833d-baa692ec7af0/nmstate-handler/0.log" Mar 12 16:05:53 crc kubenswrapper[4869]: I0312 16:05:53.791871 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-29dkf_f2b26b4e-3c20-4729-b93e-81e8909a4c86/kube-rbac-proxy/0.log" Mar 12 16:05:53 crc kubenswrapper[4869]: I0312 16:05:53.797301 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-29dkf_f2b26b4e-3c20-4729-b93e-81e8909a4c86/nmstate-metrics/0.log" Mar 12 16:05:53 crc kubenswrapper[4869]: I0312 16:05:53.962079 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-l4dd7_6bd6a2b1-e9dc-4f93-9c48-a2c2d8622e49/nmstate-operator/0.log" Mar 12 16:05:54 crc kubenswrapper[4869]: I0312 16:05:54.006509 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-gkvrq_7073c354-8a03-4c91-a9aa-9ec780f52b65/nmstate-webhook/0.log" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.635842 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ds6zj"] Mar 12 16:05:55 crc kubenswrapper[4869]: E0312 16:05:55.636792 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a972d17c-7397-40c9-a6d4-610df87a932a" containerName="oc" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.636810 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="a972d17c-7397-40c9-a6d4-610df87a932a" containerName="oc" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 
16:05:55.637105 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="a972d17c-7397-40c9-a6d4-610df87a932a" containerName="oc" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.638846 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.644899 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ds6zj"] Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.794232 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-utilities\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.794676 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6hwr\" (UniqueName: \"kubernetes.io/projected/92c15473-2803-485b-8f9f-aa11c85b81e1-kube-api-access-w6hwr\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.794709 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-catalog-content\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.896967 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-utilities\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.897068 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6hwr\" (UniqueName: \"kubernetes.io/projected/92c15473-2803-485b-8f9f-aa11c85b81e1-kube-api-access-w6hwr\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.897102 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-catalog-content\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.897718 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-catalog-content\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.897765 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-utilities\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:55 crc kubenswrapper[4869]: I0312 16:05:55.916361 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6hwr\" (UniqueName: 
\"kubernetes.io/projected/92c15473-2803-485b-8f9f-aa11c85b81e1-kube-api-access-w6hwr\") pod \"certified-operators-ds6zj\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:56 crc kubenswrapper[4869]: I0312 16:05:56.010193 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:05:56 crc kubenswrapper[4869]: I0312 16:05:56.524937 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ds6zj"] Mar 12 16:05:56 crc kubenswrapper[4869]: I0312 16:05:56.942705 4869 generic.go:334] "Generic (PLEG): container finished" podID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerID="50395e2ea1e8c7a9a5a916f517fb35e41950079713a5ead43671f7ebd05bfd80" exitCode=0 Mar 12 16:05:56 crc kubenswrapper[4869]: I0312 16:05:56.942925 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds6zj" event={"ID":"92c15473-2803-485b-8f9f-aa11c85b81e1","Type":"ContainerDied","Data":"50395e2ea1e8c7a9a5a916f517fb35e41950079713a5ead43671f7ebd05bfd80"} Mar 12 16:05:56 crc kubenswrapper[4869]: I0312 16:05:56.942979 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds6zj" event={"ID":"92c15473-2803-485b-8f9f-aa11c85b81e1","Type":"ContainerStarted","Data":"aff78976d4cf4f1c0c3c4d3ba0a15cdb808f2019e35824e085e46b4eee0121b0"} Mar 12 16:05:57 crc kubenswrapper[4869]: I0312 16:05:57.953292 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds6zj" event={"ID":"92c15473-2803-485b-8f9f-aa11c85b81e1","Type":"ContainerStarted","Data":"1fe76244eb347aa93c89fdd38bc0ea5f926a9d87450e8d0316c145bc4e7a4448"} Mar 12 16:05:58 crc kubenswrapper[4869]: I0312 16:05:58.961888 4869 generic.go:334] "Generic (PLEG): container finished" podID="92c15473-2803-485b-8f9f-aa11c85b81e1" 
containerID="1fe76244eb347aa93c89fdd38bc0ea5f926a9d87450e8d0316c145bc4e7a4448" exitCode=0 Mar 12 16:05:58 crc kubenswrapper[4869]: I0312 16:05:58.961943 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds6zj" event={"ID":"92c15473-2803-485b-8f9f-aa11c85b81e1","Type":"ContainerDied","Data":"1fe76244eb347aa93c89fdd38bc0ea5f926a9d87450e8d0316c145bc4e7a4448"} Mar 12 16:05:59 crc kubenswrapper[4869]: I0312 16:05:59.972152 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds6zj" event={"ID":"92c15473-2803-485b-8f9f-aa11c85b81e1","Type":"ContainerStarted","Data":"b775063bd894e1b722874b367fa7c9937a7e9604687433c128f5356c14d38c69"} Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.000331 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ds6zj" podStartSLOduration=2.551379552 podStartE2EDuration="5.000307958s" podCreationTimestamp="2026-03-12 16:05:55 +0000 UTC" firstStartedPulling="2026-03-12 16:05:56.94495256 +0000 UTC m=+4709.230177838" lastFinishedPulling="2026-03-12 16:05:59.393880966 +0000 UTC m=+4711.679106244" observedRunningTime="2026-03-12 16:05:59.993519777 +0000 UTC m=+4712.278745055" watchObservedRunningTime="2026-03-12 16:06:00.000307958 +0000 UTC m=+4712.285533246" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.151789 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555526-7w9kb"] Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.157930 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555526-7w9kb" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.163045 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.163348 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.163933 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.166063 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555526-7w9kb"] Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.296218 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9xr8\" (UniqueName: \"kubernetes.io/projected/31c11f7f-699b-4477-85c1-81fed25b5b00-kube-api-access-f9xr8\") pod \"auto-csr-approver-29555526-7w9kb\" (UID: \"31c11f7f-699b-4477-85c1-81fed25b5b00\") " pod="openshift-infra/auto-csr-approver-29555526-7w9kb" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.398844 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9xr8\" (UniqueName: \"kubernetes.io/projected/31c11f7f-699b-4477-85c1-81fed25b5b00-kube-api-access-f9xr8\") pod \"auto-csr-approver-29555526-7w9kb\" (UID: \"31c11f7f-699b-4477-85c1-81fed25b5b00\") " pod="openshift-infra/auto-csr-approver-29555526-7w9kb" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.434590 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9xr8\" (UniqueName: \"kubernetes.io/projected/31c11f7f-699b-4477-85c1-81fed25b5b00-kube-api-access-f9xr8\") pod \"auto-csr-approver-29555526-7w9kb\" (UID: \"31c11f7f-699b-4477-85c1-81fed25b5b00\") " 
pod="openshift-infra/auto-csr-approver-29555526-7w9kb" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.473920 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555526-7w9kb" Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.934153 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555526-7w9kb"] Mar 12 16:06:00 crc kubenswrapper[4869]: W0312 16:06:00.943215 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31c11f7f_699b_4477_85c1_81fed25b5b00.slice/crio-31f6e458393f26a4a52d400a4a71995287cb7099b61e4b237f8f6f3af5b34f2f WatchSource:0}: Error finding container 31f6e458393f26a4a52d400a4a71995287cb7099b61e4b237f8f6f3af5b34f2f: Status 404 returned error can't find the container with id 31f6e458393f26a4a52d400a4a71995287cb7099b61e4b237f8f6f3af5b34f2f Mar 12 16:06:00 crc kubenswrapper[4869]: I0312 16:06:00.986971 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555526-7w9kb" event={"ID":"31c11f7f-699b-4477-85c1-81fed25b5b00","Type":"ContainerStarted","Data":"31f6e458393f26a4a52d400a4a71995287cb7099b61e4b237f8f6f3af5b34f2f"} Mar 12 16:06:03 crc kubenswrapper[4869]: I0312 16:06:03.010212 4869 generic.go:334] "Generic (PLEG): container finished" podID="31c11f7f-699b-4477-85c1-81fed25b5b00" containerID="e7581b3c6022fd109d5803bc1c5c2afbc5a54dde6d875d3e085d432e1bfde613" exitCode=0 Mar 12 16:06:03 crc kubenswrapper[4869]: I0312 16:06:03.010333 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555526-7w9kb" event={"ID":"31c11f7f-699b-4477-85c1-81fed25b5b00","Type":"ContainerDied","Data":"e7581b3c6022fd109d5803bc1c5c2afbc5a54dde6d875d3e085d432e1bfde613"} Mar 12 16:06:04 crc kubenswrapper[4869]: I0312 16:06:04.337668 4869 scope.go:117] "RemoveContainer" 
containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:06:04 crc kubenswrapper[4869]: E0312 16:06:04.338331 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:06:04 crc kubenswrapper[4869]: I0312 16:06:04.352068 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555526-7w9kb" Mar 12 16:06:04 crc kubenswrapper[4869]: I0312 16:06:04.498304 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9xr8\" (UniqueName: \"kubernetes.io/projected/31c11f7f-699b-4477-85c1-81fed25b5b00-kube-api-access-f9xr8\") pod \"31c11f7f-699b-4477-85c1-81fed25b5b00\" (UID: \"31c11f7f-699b-4477-85c1-81fed25b5b00\") " Mar 12 16:06:04 crc kubenswrapper[4869]: I0312 16:06:04.517276 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31c11f7f-699b-4477-85c1-81fed25b5b00-kube-api-access-f9xr8" (OuterVolumeSpecName: "kube-api-access-f9xr8") pod "31c11f7f-699b-4477-85c1-81fed25b5b00" (UID: "31c11f7f-699b-4477-85c1-81fed25b5b00"). InnerVolumeSpecName "kube-api-access-f9xr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:06:04 crc kubenswrapper[4869]: I0312 16:06:04.602285 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9xr8\" (UniqueName: \"kubernetes.io/projected/31c11f7f-699b-4477-85c1-81fed25b5b00-kube-api-access-f9xr8\") on node \"crc\" DevicePath \"\"" Mar 12 16:06:05 crc kubenswrapper[4869]: I0312 16:06:05.028121 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555526-7w9kb" event={"ID":"31c11f7f-699b-4477-85c1-81fed25b5b00","Type":"ContainerDied","Data":"31f6e458393f26a4a52d400a4a71995287cb7099b61e4b237f8f6f3af5b34f2f"} Mar 12 16:06:05 crc kubenswrapper[4869]: I0312 16:06:05.028347 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f6e458393f26a4a52d400a4a71995287cb7099b61e4b237f8f6f3af5b34f2f" Mar 12 16:06:05 crc kubenswrapper[4869]: I0312 16:06:05.028179 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555526-7w9kb" Mar 12 16:06:05 crc kubenswrapper[4869]: I0312 16:06:05.418498 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555520-v8kjt"] Mar 12 16:06:05 crc kubenswrapper[4869]: I0312 16:06:05.427672 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555520-v8kjt"] Mar 12 16:06:06 crc kubenswrapper[4869]: I0312 16:06:06.011124 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:06:06 crc kubenswrapper[4869]: I0312 16:06:06.011760 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:06:06 crc kubenswrapper[4869]: I0312 16:06:06.075149 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ds6zj" Mar 
12 16:06:06 crc kubenswrapper[4869]: I0312 16:06:06.141858 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:06:06 crc kubenswrapper[4869]: I0312 16:06:06.322829 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ds6zj"] Mar 12 16:06:06 crc kubenswrapper[4869]: I0312 16:06:06.347728 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4e751e-a8db-4ffd-97e9-a0dba749a305" path="/var/lib/kubelet/pods/eb4e751e-a8db-4ffd-97e9-a0dba749a305/volumes" Mar 12 16:06:08 crc kubenswrapper[4869]: I0312 16:06:08.052346 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ds6zj" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="registry-server" containerID="cri-o://b775063bd894e1b722874b367fa7c9937a7e9604687433c128f5356c14d38c69" gracePeriod=2 Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.064461 4869 generic.go:334] "Generic (PLEG): container finished" podID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerID="b775063bd894e1b722874b367fa7c9937a7e9604687433c128f5356c14d38c69" exitCode=0 Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.064581 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds6zj" event={"ID":"92c15473-2803-485b-8f9f-aa11c85b81e1","Type":"ContainerDied","Data":"b775063bd894e1b722874b367fa7c9937a7e9604687433c128f5356c14d38c69"} Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.780782 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.914768 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6hwr\" (UniqueName: \"kubernetes.io/projected/92c15473-2803-485b-8f9f-aa11c85b81e1-kube-api-access-w6hwr\") pod \"92c15473-2803-485b-8f9f-aa11c85b81e1\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.915001 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-catalog-content\") pod \"92c15473-2803-485b-8f9f-aa11c85b81e1\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.915031 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-utilities\") pod \"92c15473-2803-485b-8f9f-aa11c85b81e1\" (UID: \"92c15473-2803-485b-8f9f-aa11c85b81e1\") " Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.915811 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-utilities" (OuterVolumeSpecName: "utilities") pod "92c15473-2803-485b-8f9f-aa11c85b81e1" (UID: "92c15473-2803-485b-8f9f-aa11c85b81e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.923901 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c15473-2803-485b-8f9f-aa11c85b81e1-kube-api-access-w6hwr" (OuterVolumeSpecName: "kube-api-access-w6hwr") pod "92c15473-2803-485b-8f9f-aa11c85b81e1" (UID: "92c15473-2803-485b-8f9f-aa11c85b81e1"). InnerVolumeSpecName "kube-api-access-w6hwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:06:09 crc kubenswrapper[4869]: I0312 16:06:09.967005 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92c15473-2803-485b-8f9f-aa11c85b81e1" (UID: "92c15473-2803-485b-8f9f-aa11c85b81e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.017711 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.017748 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6hwr\" (UniqueName: \"kubernetes.io/projected/92c15473-2803-485b-8f9f-aa11c85b81e1-kube-api-access-w6hwr\") on node \"crc\" DevicePath \"\"" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.017762 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92c15473-2803-485b-8f9f-aa11c85b81e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.077756 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds6zj" event={"ID":"92c15473-2803-485b-8f9f-aa11c85b81e1","Type":"ContainerDied","Data":"aff78976d4cf4f1c0c3c4d3ba0a15cdb808f2019e35824e085e46b4eee0121b0"} Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.077824 4869 scope.go:117] "RemoveContainer" containerID="b775063bd894e1b722874b367fa7c9937a7e9604687433c128f5356c14d38c69" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.077832 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ds6zj" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.099605 4869 scope.go:117] "RemoveContainer" containerID="1fe76244eb347aa93c89fdd38bc0ea5f926a9d87450e8d0316c145bc4e7a4448" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.114312 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ds6zj"] Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.124938 4869 scope.go:117] "RemoveContainer" containerID="50395e2ea1e8c7a9a5a916f517fb35e41950079713a5ead43671f7ebd05bfd80" Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.128584 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ds6zj"] Mar 12 16:06:10 crc kubenswrapper[4869]: I0312 16:06:10.346814 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" path="/var/lib/kubelet/pods/92c15473-2803-485b-8f9f-aa11c85b81e1/volumes" Mar 12 16:06:18 crc kubenswrapper[4869]: I0312 16:06:18.346702 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:06:18 crc kubenswrapper[4869]: E0312 16:06:18.347613 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:22.999975 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-xr7s4_0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a/kube-rbac-proxy/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.098510 
4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-xr7s4_0f3cc7c1-69f1-47ac-bc57-20ea088c9d6a/controller/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.214944 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-frr-files/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.431308 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-reloader/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.449941 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-metrics/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.463000 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-frr-files/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.483510 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-reloader/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.654861 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-reloader/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.659827 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-frr-files/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.692845 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-metrics/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.718405 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-metrics/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.913366 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-frr-files/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.971907 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-metrics/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.987647 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/cp-reloader/0.log" Mar 12 16:06:23 crc kubenswrapper[4869]: I0312 16:06:23.994976 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/controller/0.log" Mar 12 16:06:24 crc kubenswrapper[4869]: I0312 16:06:24.152527 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/frr-metrics/0.log" Mar 12 16:06:24 crc kubenswrapper[4869]: I0312 16:06:24.190250 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/kube-rbac-proxy/0.log" Mar 12 16:06:24 crc kubenswrapper[4869]: I0312 16:06:24.221888 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/kube-rbac-proxy-frr/0.log" Mar 12 16:06:24 crc kubenswrapper[4869]: I0312 16:06:24.432452 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/reloader/0.log" Mar 12 16:06:24 crc kubenswrapper[4869]: I0312 16:06:24.439133 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-gw94s_2d1f63b9-3da1-4d8d-a803-6d8e1cec0081/frr-k8s-webhook-server/0.log" Mar 12 16:06:24 crc kubenswrapper[4869]: I0312 16:06:24.679462 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7899659c6c-5rm9x_2c705e1d-1ea5-4f81-a1d9-9888f96b36be/manager/0.log" Mar 12 16:06:24 crc kubenswrapper[4869]: I0312 16:06:24.914512 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-86c86ddd4-pds8d_c5eb89db-d2ff-4a93-b12b-83ac78e93e65/webhook-server/0.log" Mar 12 16:06:25 crc kubenswrapper[4869]: I0312 16:06:25.013904 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jkzb4_4162c7d8-9940-4754-884e-7a8ed66d3281/kube-rbac-proxy/0.log" Mar 12 16:06:25 crc kubenswrapper[4869]: I0312 16:06:25.802772 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jkzb4_4162c7d8-9940-4754-884e-7a8ed66d3281/speaker/0.log" Mar 12 16:06:26 crc kubenswrapper[4869]: I0312 16:06:26.133184 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kk29n_4c5190c8-103c-41d8-b144-a5fc95608626/frr/0.log" Mar 12 16:06:31 crc kubenswrapper[4869]: I0312 16:06:31.337034 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:06:31 crc kubenswrapper[4869]: E0312 16:06:31.337835 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:06:38 crc kubenswrapper[4869]: I0312 16:06:38.403383 
4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw_15ef5f34-7088-4444-8423-ace6aaa9661a/util/0.log" Mar 12 16:06:38 crc kubenswrapper[4869]: I0312 16:06:38.641814 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw_15ef5f34-7088-4444-8423-ace6aaa9661a/pull/0.log" Mar 12 16:06:38 crc kubenswrapper[4869]: I0312 16:06:38.683992 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw_15ef5f34-7088-4444-8423-ace6aaa9661a/pull/0.log" Mar 12 16:06:38 crc kubenswrapper[4869]: I0312 16:06:38.686972 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw_15ef5f34-7088-4444-8423-ace6aaa9661a/util/0.log" Mar 12 16:06:38 crc kubenswrapper[4869]: I0312 16:06:38.918042 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw_15ef5f34-7088-4444-8423-ace6aaa9661a/extract/0.log" Mar 12 16:06:38 crc kubenswrapper[4869]: I0312 16:06:38.919808 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw_15ef5f34-7088-4444-8423-ace6aaa9661a/util/0.log" Mar 12 16:06:38 crc kubenswrapper[4869]: I0312 16:06:38.960889 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pf5sw_15ef5f34-7088-4444-8423-ace6aaa9661a/pull/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.087763 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6_3647e6ae-9add-4c7f-a1d8-abb0397e4954/util/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.214733 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6_3647e6ae-9add-4c7f-a1d8-abb0397e4954/pull/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.233226 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6_3647e6ae-9add-4c7f-a1d8-abb0397e4954/util/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.260300 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6_3647e6ae-9add-4c7f-a1d8-abb0397e4954/pull/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.390472 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6_3647e6ae-9add-4c7f-a1d8-abb0397e4954/util/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.397944 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6_3647e6ae-9add-4c7f-a1d8-abb0397e4954/pull/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.418805 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ghrh6_3647e6ae-9add-4c7f-a1d8-abb0397e4954/extract/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.545941 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8d8dl_91158fdd-957d-44dc-889c-325cdcffb980/extract-utilities/0.log" Mar 12 16:06:39 crc 
kubenswrapper[4869]: I0312 16:06:39.698108 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8d8dl_91158fdd-957d-44dc-889c-325cdcffb980/extract-content/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.710573 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8d8dl_91158fdd-957d-44dc-889c-325cdcffb980/extract-utilities/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.742772 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8d8dl_91158fdd-957d-44dc-889c-325cdcffb980/extract-content/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.929239 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8d8dl_91158fdd-957d-44dc-889c-325cdcffb980/extract-content/0.log" Mar 12 16:06:39 crc kubenswrapper[4869]: I0312 16:06:39.934339 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8d8dl_91158fdd-957d-44dc-889c-325cdcffb980/extract-utilities/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.110717 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-994qp_6366ae71-5f0e-40d7-b27a-d94649d6ca4f/extract-utilities/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.391947 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-994qp_6366ae71-5f0e-40d7-b27a-d94649d6ca4f/extract-utilities/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.398248 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-994qp_6366ae71-5f0e-40d7-b27a-d94649d6ca4f/extract-content/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.423395 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-994qp_6366ae71-5f0e-40d7-b27a-d94649d6ca4f/extract-content/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.563606 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8d8dl_91158fdd-957d-44dc-889c-325cdcffb980/registry-server/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.580325 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-994qp_6366ae71-5f0e-40d7-b27a-d94649d6ca4f/extract-content/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.634534 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-994qp_6366ae71-5f0e-40d7-b27a-d94649d6ca4f/extract-utilities/0.log" Mar 12 16:06:40 crc kubenswrapper[4869]: I0312 16:06:40.810698 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9spzt_57f6b5a5-3396-4238-b09c-5c5cf9a81ff9/marketplace-operator/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.029188 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76wh2_a697779f-0fbb-4a95-aa75-b8fe3fc77944/extract-utilities/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.114222 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-994qp_6366ae71-5f0e-40d7-b27a-d94649d6ca4f/registry-server/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.180947 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76wh2_a697779f-0fbb-4a95-aa75-b8fe3fc77944/extract-utilities/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.234166 4869 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-76wh2_a697779f-0fbb-4a95-aa75-b8fe3fc77944/extract-content/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.246345 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76wh2_a697779f-0fbb-4a95-aa75-b8fe3fc77944/extract-content/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.402310 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76wh2_a697779f-0fbb-4a95-aa75-b8fe3fc77944/extract-utilities/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.442877 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76wh2_a697779f-0fbb-4a95-aa75-b8fe3fc77944/extract-content/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.634250 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-76wh2_a697779f-0fbb-4a95-aa75-b8fe3fc77944/registry-server/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.644602 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v6qsw_c1f177f0-abce-4ed6-9aee-0d11fd6818ef/extract-utilities/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.825656 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v6qsw_c1f177f0-abce-4ed6-9aee-0d11fd6818ef/extract-utilities/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.837149 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v6qsw_c1f177f0-abce-4ed6-9aee-0d11fd6818ef/extract-content/0.log" Mar 12 16:06:41 crc kubenswrapper[4869]: I0312 16:06:41.855441 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v6qsw_c1f177f0-abce-4ed6-9aee-0d11fd6818ef/extract-content/0.log" 
Mar 12 16:06:42 crc kubenswrapper[4869]: I0312 16:06:42.036673 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v6qsw_c1f177f0-abce-4ed6-9aee-0d11fd6818ef/extract-utilities/0.log" Mar 12 16:06:42 crc kubenswrapper[4869]: I0312 16:06:42.061317 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v6qsw_c1f177f0-abce-4ed6-9aee-0d11fd6818ef/extract-content/0.log" Mar 12 16:06:42 crc kubenswrapper[4869]: I0312 16:06:42.336754 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:06:42 crc kubenswrapper[4869]: E0312 16:06:42.337061 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:06:42 crc kubenswrapper[4869]: I0312 16:06:42.477718 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v6qsw_c1f177f0-abce-4ed6-9aee-0d11fd6818ef/registry-server/0.log" Mar 12 16:06:46 crc kubenswrapper[4869]: I0312 16:06:46.942978 4869 scope.go:117] "RemoveContainer" containerID="cf11f4c01b7522bb921936f8376c24ccc4d962bfce414ecc898c2b8a038f1114" Mar 12 16:06:54 crc kubenswrapper[4869]: I0312 16:06:54.337025 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:06:54 crc kubenswrapper[4869]: E0312 16:06:54.337891 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:07:05 crc kubenswrapper[4869]: I0312 16:07:05.336777 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:07:05 crc kubenswrapper[4869]: E0312 16:07:05.337532 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:07:18 crc kubenswrapper[4869]: I0312 16:07:18.344515 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:07:18 crc kubenswrapper[4869]: E0312 16:07:18.345389 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:07:33 crc kubenswrapper[4869]: I0312 16:07:33.337403 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:07:33 crc kubenswrapper[4869]: E0312 16:07:33.338338 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:07:46 crc kubenswrapper[4869]: I0312 16:07:46.336756 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:07:46 crc kubenswrapper[4869]: E0312 16:07:46.338756 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:07:58 crc kubenswrapper[4869]: I0312 16:07:58.345219 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:07:58 crc kubenswrapper[4869]: E0312 16:07:58.346304 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.151268 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555528-5zpcm"] Mar 12 16:08:00 crc kubenswrapper[4869]: E0312 16:08:00.152123 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="extract-utilities" Mar 12 16:08:00 crc 
kubenswrapper[4869]: I0312 16:08:00.152139 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="extract-utilities" Mar 12 16:08:00 crc kubenswrapper[4869]: E0312 16:08:00.152179 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31c11f7f-699b-4477-85c1-81fed25b5b00" containerName="oc" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.152187 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="31c11f7f-699b-4477-85c1-81fed25b5b00" containerName="oc" Mar 12 16:08:00 crc kubenswrapper[4869]: E0312 16:08:00.152213 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="extract-content" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.152221 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="extract-content" Mar 12 16:08:00 crc kubenswrapper[4869]: E0312 16:08:00.152232 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="registry-server" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.152240 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="registry-server" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.152451 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="31c11f7f-699b-4477-85c1-81fed25b5b00" containerName="oc" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.152482 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="92c15473-2803-485b-8f9f-aa11c85b81e1" containerName="registry-server" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.153208 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555528-5zpcm" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.155802 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.156135 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.156598 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.181148 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555528-5zpcm"] Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.247760 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmmkx\" (UniqueName: \"kubernetes.io/projected/6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1-kube-api-access-gmmkx\") pod \"auto-csr-approver-29555528-5zpcm\" (UID: \"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1\") " pod="openshift-infra/auto-csr-approver-29555528-5zpcm" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.351167 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmmkx\" (UniqueName: \"kubernetes.io/projected/6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1-kube-api-access-gmmkx\") pod \"auto-csr-approver-29555528-5zpcm\" (UID: \"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1\") " pod="openshift-infra/auto-csr-approver-29555528-5zpcm" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.377386 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmmkx\" (UniqueName: \"kubernetes.io/projected/6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1-kube-api-access-gmmkx\") pod \"auto-csr-approver-29555528-5zpcm\" (UID: \"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1\") " 
pod="openshift-infra/auto-csr-approver-29555528-5zpcm" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.478514 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555528-5zpcm" Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.971788 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555528-5zpcm"] Mar 12 16:08:00 crc kubenswrapper[4869]: I0312 16:08:00.980167 4869 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 16:08:01 crc kubenswrapper[4869]: I0312 16:08:01.039052 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555528-5zpcm" event={"ID":"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1","Type":"ContainerStarted","Data":"0bd52e7a827700b676528ac8c503b7e1f166e4794adf601cfd8735d46515ea9c"} Mar 12 16:08:03 crc kubenswrapper[4869]: I0312 16:08:03.056044 4869 generic.go:334] "Generic (PLEG): container finished" podID="6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1" containerID="ef370dca860c461361007e0e1fb8f6b40de0ccbc3113cb34446e907c602a871e" exitCode=0 Mar 12 16:08:03 crc kubenswrapper[4869]: I0312 16:08:03.056098 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555528-5zpcm" event={"ID":"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1","Type":"ContainerDied","Data":"ef370dca860c461361007e0e1fb8f6b40de0ccbc3113cb34446e907c602a871e"} Mar 12 16:08:04 crc kubenswrapper[4869]: I0312 16:08:04.472073 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555528-5zpcm" Mar 12 16:08:04 crc kubenswrapper[4869]: I0312 16:08:04.639999 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmmkx\" (UniqueName: \"kubernetes.io/projected/6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1-kube-api-access-gmmkx\") pod \"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1\" (UID: \"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1\") " Mar 12 16:08:04 crc kubenswrapper[4869]: I0312 16:08:04.645041 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1-kube-api-access-gmmkx" (OuterVolumeSpecName: "kube-api-access-gmmkx") pod "6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1" (UID: "6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1"). InnerVolumeSpecName "kube-api-access-gmmkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:08:04 crc kubenswrapper[4869]: I0312 16:08:04.742764 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmmkx\" (UniqueName: \"kubernetes.io/projected/6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1-kube-api-access-gmmkx\") on node \"crc\" DevicePath \"\"" Mar 12 16:08:05 crc kubenswrapper[4869]: I0312 16:08:05.075330 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555528-5zpcm" event={"ID":"6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1","Type":"ContainerDied","Data":"0bd52e7a827700b676528ac8c503b7e1f166e4794adf601cfd8735d46515ea9c"} Mar 12 16:08:05 crc kubenswrapper[4869]: I0312 16:08:05.075370 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bd52e7a827700b676528ac8c503b7e1f166e4794adf601cfd8735d46515ea9c" Mar 12 16:08:05 crc kubenswrapper[4869]: I0312 16:08:05.075401 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555528-5zpcm" Mar 12 16:08:05 crc kubenswrapper[4869]: I0312 16:08:05.562865 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555522-7bd8f"] Mar 12 16:08:05 crc kubenswrapper[4869]: I0312 16:08:05.571874 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555522-7bd8f"] Mar 12 16:08:06 crc kubenswrapper[4869]: I0312 16:08:06.349175 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ac2137-2cbd-424a-8668-cd87d1fe0334" path="/var/lib/kubelet/pods/e1ac2137-2cbd-424a-8668-cd87d1fe0334/volumes" Mar 12 16:08:12 crc kubenswrapper[4869]: I0312 16:08:12.337325 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:08:12 crc kubenswrapper[4869]: E0312 16:08:12.338212 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:08:27 crc kubenswrapper[4869]: I0312 16:08:27.337099 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:08:27 crc kubenswrapper[4869]: E0312 16:08:27.338151 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" 
podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:08:38 crc kubenswrapper[4869]: I0312 16:08:38.347662 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:08:38 crc kubenswrapper[4869]: E0312 16:08:38.348486 4869 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2lgzz_openshift-machine-config-operator(1621c994-94d2-4105-a988-f4739518ba91)\"" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" Mar 12 16:08:47 crc kubenswrapper[4869]: I0312 16:08:47.074340 4869 scope.go:117] "RemoveContainer" containerID="1ee983b7440ac0ee3b9ea83249f52f0c23cd0382c8a0271c47052f270e30e2f0" Mar 12 16:08:51 crc kubenswrapper[4869]: I0312 16:08:51.337035 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:08:52 crc kubenswrapper[4869]: I0312 16:08:52.514037 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"6776e811c46f14d863ba424139ec1761f141f4c49b869f301495cffe675c4a4d"} Mar 12 16:08:56 crc kubenswrapper[4869]: I0312 16:08:56.549435 4869 generic.go:334] "Generic (PLEG): container finished" podID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerID="3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba" exitCode=0 Mar 12 16:08:56 crc kubenswrapper[4869]: I0312 16:08:56.549985 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" event={"ID":"93aa2341-4454-4086-8a4e-c63f6d317bbf","Type":"ContainerDied","Data":"3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba"} Mar 12 
16:08:56 crc kubenswrapper[4869]: I0312 16:08:56.550647 4869 scope.go:117] "RemoveContainer" containerID="3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba" Mar 12 16:08:57 crc kubenswrapper[4869]: I0312 16:08:57.325506 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w5hx4_must-gather-5zgtd_93aa2341-4454-4086-8a4e-c63f6d317bbf/gather/0.log" Mar 12 16:09:05 crc kubenswrapper[4869]: I0312 16:09:05.941055 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5hx4/must-gather-5zgtd"] Mar 12 16:09:05 crc kubenswrapper[4869]: I0312 16:09:05.941818 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerName="copy" containerID="cri-o://784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7" gracePeriod=2 Mar 12 16:09:05 crc kubenswrapper[4869]: I0312 16:09:05.966292 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5hx4/must-gather-5zgtd"] Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.427408 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w5hx4_must-gather-5zgtd_93aa2341-4454-4086-8a4e-c63f6d317bbf/copy/0.log" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.428845 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.547689 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmspj\" (UniqueName: \"kubernetes.io/projected/93aa2341-4454-4086-8a4e-c63f6d317bbf-kube-api-access-qmspj\") pod \"93aa2341-4454-4086-8a4e-c63f6d317bbf\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.547875 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93aa2341-4454-4086-8a4e-c63f6d317bbf-must-gather-output\") pod \"93aa2341-4454-4086-8a4e-c63f6d317bbf\" (UID: \"93aa2341-4454-4086-8a4e-c63f6d317bbf\") " Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.555914 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93aa2341-4454-4086-8a4e-c63f6d317bbf-kube-api-access-qmspj" (OuterVolumeSpecName: "kube-api-access-qmspj") pod "93aa2341-4454-4086-8a4e-c63f6d317bbf" (UID: "93aa2341-4454-4086-8a4e-c63f6d317bbf"). InnerVolumeSpecName "kube-api-access-qmspj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.644708 4869 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w5hx4_must-gather-5zgtd_93aa2341-4454-4086-8a4e-c63f6d317bbf/copy/0.log" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.647079 4869 generic.go:334] "Generic (PLEG): container finished" podID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerID="784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7" exitCode=143 Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.647138 4869 scope.go:117] "RemoveContainer" containerID="784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.647201 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5hx4/must-gather-5zgtd" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.651100 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmspj\" (UniqueName: \"kubernetes.io/projected/93aa2341-4454-4086-8a4e-c63f6d317bbf-kube-api-access-qmspj\") on node \"crc\" DevicePath \"\"" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.678595 4869 scope.go:117] "RemoveContainer" containerID="3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.763334 4869 scope.go:117] "RemoveContainer" containerID="784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7" Mar 12 16:09:06 crc kubenswrapper[4869]: E0312 16:09:06.764769 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7\": container with ID starting with 784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7 not found: ID does not exist" 
containerID="784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.764801 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7"} err="failed to get container status \"784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7\": rpc error: code = NotFound desc = could not find container \"784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7\": container with ID starting with 784e6adc52403be149c9523e640c80df52676c9ba4095f82874b4dacef002fa7 not found: ID does not exist" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.764822 4869 scope.go:117] "RemoveContainer" containerID="3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba" Mar 12 16:09:06 crc kubenswrapper[4869]: E0312 16:09:06.765090 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba\": container with ID starting with 3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba not found: ID does not exist" containerID="3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.765104 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba"} err="failed to get container status \"3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba\": rpc error: code = NotFound desc = could not find container \"3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba\": container with ID starting with 3a1670db671d421b749e5492f8dba026d95450ba8ac11e3568a08de17e7033ba not found: ID does not exist" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.786604 4869 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93aa2341-4454-4086-8a4e-c63f6d317bbf-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "93aa2341-4454-4086-8a4e-c63f6d317bbf" (UID: "93aa2341-4454-4086-8a4e-c63f6d317bbf"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:09:06 crc kubenswrapper[4869]: I0312 16:09:06.864950 4869 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93aa2341-4454-4086-8a4e-c63f6d317bbf-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 12 16:09:08 crc kubenswrapper[4869]: I0312 16:09:08.346412 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" path="/var/lib/kubelet/pods/93aa2341-4454-4086-8a4e-c63f6d317bbf/volumes" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.142651 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555530-p5jhb"] Mar 12 16:10:00 crc kubenswrapper[4869]: E0312 16:10:00.143634 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1" containerName="oc" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.143653 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1" containerName="oc" Mar 12 16:10:00 crc kubenswrapper[4869]: E0312 16:10:00.143719 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerName="gather" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.143727 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerName="gather" Mar 12 16:10:00 crc kubenswrapper[4869]: E0312 16:10:00.143748 4869 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerName="copy" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.143757 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerName="copy" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.143992 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aa8f5c4-75cb-4eb5-9439-b4e56e51c3a1" containerName="oc" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.144009 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerName="gather" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.144033 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="93aa2341-4454-4086-8a4e-c63f6d317bbf" containerName="copy" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.144829 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555530-p5jhb" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.151825 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.151952 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.152072 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.156492 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555530-p5jhb"] Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.214370 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l858s\" (UniqueName: 
\"kubernetes.io/projected/63662ee2-d68d-4321-9abb-aaad3ead6c5e-kube-api-access-l858s\") pod \"auto-csr-approver-29555530-p5jhb\" (UID: \"63662ee2-d68d-4321-9abb-aaad3ead6c5e\") " pod="openshift-infra/auto-csr-approver-29555530-p5jhb" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.316716 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l858s\" (UniqueName: \"kubernetes.io/projected/63662ee2-d68d-4321-9abb-aaad3ead6c5e-kube-api-access-l858s\") pod \"auto-csr-approver-29555530-p5jhb\" (UID: \"63662ee2-d68d-4321-9abb-aaad3ead6c5e\") " pod="openshift-infra/auto-csr-approver-29555530-p5jhb" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.351651 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l858s\" (UniqueName: \"kubernetes.io/projected/63662ee2-d68d-4321-9abb-aaad3ead6c5e-kube-api-access-l858s\") pod \"auto-csr-approver-29555530-p5jhb\" (UID: \"63662ee2-d68d-4321-9abb-aaad3ead6c5e\") " pod="openshift-infra/auto-csr-approver-29555530-p5jhb" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.466600 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555530-p5jhb" Mar 12 16:10:00 crc kubenswrapper[4869]: I0312 16:10:00.916812 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555530-p5jhb"] Mar 12 16:10:01 crc kubenswrapper[4869]: I0312 16:10:01.150832 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555530-p5jhb" event={"ID":"63662ee2-d68d-4321-9abb-aaad3ead6c5e","Type":"ContainerStarted","Data":"37b2df6c2e0f19a55b363168a1109f4171958715fd60a06af3dcb15a6320d5c4"} Mar 12 16:10:03 crc kubenswrapper[4869]: I0312 16:10:03.169320 4869 generic.go:334] "Generic (PLEG): container finished" podID="63662ee2-d68d-4321-9abb-aaad3ead6c5e" containerID="a914bdccdeb46fb58d7a4fb31f0952990e5d44d3bcfe7162def93505653a1a69" exitCode=0 Mar 12 16:10:03 crc kubenswrapper[4869]: I0312 16:10:03.169372 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555530-p5jhb" event={"ID":"63662ee2-d68d-4321-9abb-aaad3ead6c5e","Type":"ContainerDied","Data":"a914bdccdeb46fb58d7a4fb31f0952990e5d44d3bcfe7162def93505653a1a69"} Mar 12 16:10:04 crc kubenswrapper[4869]: I0312 16:10:04.524929 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555530-p5jhb" Mar 12 16:10:04 crc kubenswrapper[4869]: I0312 16:10:04.606592 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l858s\" (UniqueName: \"kubernetes.io/projected/63662ee2-d68d-4321-9abb-aaad3ead6c5e-kube-api-access-l858s\") pod \"63662ee2-d68d-4321-9abb-aaad3ead6c5e\" (UID: \"63662ee2-d68d-4321-9abb-aaad3ead6c5e\") " Mar 12 16:10:04 crc kubenswrapper[4869]: I0312 16:10:04.612285 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63662ee2-d68d-4321-9abb-aaad3ead6c5e-kube-api-access-l858s" (OuterVolumeSpecName: "kube-api-access-l858s") pod "63662ee2-d68d-4321-9abb-aaad3ead6c5e" (UID: "63662ee2-d68d-4321-9abb-aaad3ead6c5e"). InnerVolumeSpecName "kube-api-access-l858s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:10:04 crc kubenswrapper[4869]: I0312 16:10:04.709309 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l858s\" (UniqueName: \"kubernetes.io/projected/63662ee2-d68d-4321-9abb-aaad3ead6c5e-kube-api-access-l858s\") on node \"crc\" DevicePath \"\"" Mar 12 16:10:05 crc kubenswrapper[4869]: I0312 16:10:05.193817 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555530-p5jhb" event={"ID":"63662ee2-d68d-4321-9abb-aaad3ead6c5e","Type":"ContainerDied","Data":"37b2df6c2e0f19a55b363168a1109f4171958715fd60a06af3dcb15a6320d5c4"} Mar 12 16:10:05 crc kubenswrapper[4869]: I0312 16:10:05.194158 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b2df6c2e0f19a55b363168a1109f4171958715fd60a06af3dcb15a6320d5c4" Mar 12 16:10:05 crc kubenswrapper[4869]: I0312 16:10:05.193876 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555530-p5jhb" Mar 12 16:10:05 crc kubenswrapper[4869]: I0312 16:10:05.598761 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555524-5nkw9"] Mar 12 16:10:05 crc kubenswrapper[4869]: I0312 16:10:05.609057 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555524-5nkw9"] Mar 12 16:10:06 crc kubenswrapper[4869]: I0312 16:10:06.346158 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a972d17c-7397-40c9-a6d4-610df87a932a" path="/var/lib/kubelet/pods/a972d17c-7397-40c9-a6d4-610df87a932a/volumes" Mar 12 16:10:47 crc kubenswrapper[4869]: I0312 16:10:47.868194 4869 scope.go:117] "RemoveContainer" containerID="ccf1c0d1e8b1fef8c02e8b26f9a9bfc0e993ad3b12bb78034f0921cee699e056" Mar 12 16:11:19 crc kubenswrapper[4869]: I0312 16:11:19.684400 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:11:19 crc kubenswrapper[4869]: I0312 16:11:19.685007 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:11:49 crc kubenswrapper[4869]: I0312 16:11:49.683996 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:11:49 crc kubenswrapper[4869]: 
I0312 16:11:49.684599 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.146006 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555532-p9jzs"] Mar 12 16:12:00 crc kubenswrapper[4869]: E0312 16:12:00.147090 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63662ee2-d68d-4321-9abb-aaad3ead6c5e" containerName="oc" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.147110 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="63662ee2-d68d-4321-9abb-aaad3ead6c5e" containerName="oc" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.147403 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="63662ee2-d68d-4321-9abb-aaad3ead6c5e" containerName="oc" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.148137 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555532-p9jzs" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.150194 4869 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ddrp7" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.154354 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.156731 4869 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.156774 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555532-p9jzs"] Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.167263 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmzs\" (UniqueName: \"kubernetes.io/projected/2d608096-2a3a-407f-bf7e-ed6cb4d1c935-kube-api-access-swmzs\") pod \"auto-csr-approver-29555532-p9jzs\" (UID: \"2d608096-2a3a-407f-bf7e-ed6cb4d1c935\") " pod="openshift-infra/auto-csr-approver-29555532-p9jzs" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.269150 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmzs\" (UniqueName: \"kubernetes.io/projected/2d608096-2a3a-407f-bf7e-ed6cb4d1c935-kube-api-access-swmzs\") pod \"auto-csr-approver-29555532-p9jzs\" (UID: \"2d608096-2a3a-407f-bf7e-ed6cb4d1c935\") " pod="openshift-infra/auto-csr-approver-29555532-p9jzs" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.291152 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmzs\" (UniqueName: \"kubernetes.io/projected/2d608096-2a3a-407f-bf7e-ed6cb4d1c935-kube-api-access-swmzs\") pod \"auto-csr-approver-29555532-p9jzs\" (UID: \"2d608096-2a3a-407f-bf7e-ed6cb4d1c935\") " 
pod="openshift-infra/auto-csr-approver-29555532-p9jzs" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.465804 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555532-p9jzs" Mar 12 16:12:00 crc kubenswrapper[4869]: I0312 16:12:00.954581 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555532-p9jzs"] Mar 12 16:12:01 crc kubenswrapper[4869]: I0312 16:12:01.291965 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555532-p9jzs" event={"ID":"2d608096-2a3a-407f-bf7e-ed6cb4d1c935","Type":"ContainerStarted","Data":"6cbf3329d332ff0db52e63f1fa9e972227a2fb129075da0c326e6ee6ab6edd85"} Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.029237 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fpg2n"] Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.032293 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.046587 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fpg2n"] Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.143983 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-utilities\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.144287 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bbf7\" (UniqueName: \"kubernetes.io/projected/2f378a11-8a98-4474-b327-a976632b5157-kube-api-access-8bbf7\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.144509 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-catalog-content\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.246992 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-utilities\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.247168 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8bbf7\" (UniqueName: \"kubernetes.io/projected/2f378a11-8a98-4474-b327-a976632b5157-kube-api-access-8bbf7\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.247257 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-catalog-content\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.247692 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-catalog-content\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.247921 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-utilities\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.273038 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bbf7\" (UniqueName: \"kubernetes.io/projected/2f378a11-8a98-4474-b327-a976632b5157-kube-api-access-8bbf7\") pod \"community-operators-fpg2n\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.313029 4869 generic.go:334] "Generic (PLEG): container finished" 
podID="2d608096-2a3a-407f-bf7e-ed6cb4d1c935" containerID="9a6da2484a1dc752b44c0e372dabe1e3750c2a437326f04bdb16c8f9dbeab2d8" exitCode=0 Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.313088 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555532-p9jzs" event={"ID":"2d608096-2a3a-407f-bf7e-ed6cb4d1c935","Type":"ContainerDied","Data":"9a6da2484a1dc752b44c0e372dabe1e3750c2a437326f04bdb16c8f9dbeab2d8"} Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.363723 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:03 crc kubenswrapper[4869]: I0312 16:12:03.908141 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fpg2n"] Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.327417 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpg2n" event={"ID":"2f378a11-8a98-4474-b327-a976632b5157","Type":"ContainerStarted","Data":"605711be491b13103893b57db2b3eda563c55e69f250428484495b0c62146ef8"} Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.816067 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555532-p9jzs" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.816831 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9qphl"] Mar 12 16:12:04 crc kubenswrapper[4869]: E0312 16:12:04.817630 4869 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d608096-2a3a-407f-bf7e-ed6cb4d1c935" containerName="oc" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.817649 4869 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d608096-2a3a-407f-bf7e-ed6cb4d1c935" containerName="oc" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.817827 4869 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d608096-2a3a-407f-bf7e-ed6cb4d1c935" containerName="oc" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.820311 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.836932 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qphl"] Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.885595 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swmzs\" (UniqueName: \"kubernetes.io/projected/2d608096-2a3a-407f-bf7e-ed6cb4d1c935-kube-api-access-swmzs\") pod \"2d608096-2a3a-407f-bf7e-ed6cb4d1c935\" (UID: \"2d608096-2a3a-407f-bf7e-ed6cb4d1c935\") " Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.888550 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltxxl\" (UniqueName: \"kubernetes.io/projected/7faea1a5-7f11-4f8c-87da-9404250ffe12-kube-api-access-ltxxl\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc 
kubenswrapper[4869]: I0312 16:12:04.888647 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-utilities\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.888792 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-catalog-content\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.978480 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d608096-2a3a-407f-bf7e-ed6cb4d1c935-kube-api-access-swmzs" (OuterVolumeSpecName: "kube-api-access-swmzs") pod "2d608096-2a3a-407f-bf7e-ed6cb4d1c935" (UID: "2d608096-2a3a-407f-bf7e-ed6cb4d1c935"). InnerVolumeSpecName "kube-api-access-swmzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.990997 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltxxl\" (UniqueName: \"kubernetes.io/projected/7faea1a5-7f11-4f8c-87da-9404250ffe12-kube-api-access-ltxxl\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.991159 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-utilities\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.991305 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-catalog-content\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.991507 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swmzs\" (UniqueName: \"kubernetes.io/projected/2d608096-2a3a-407f-bf7e-ed6cb4d1c935-kube-api-access-swmzs\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.991775 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-utilities\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:04 crc kubenswrapper[4869]: I0312 16:12:04.991834 4869 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-catalog-content\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.010584 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltxxl\" (UniqueName: \"kubernetes.io/projected/7faea1a5-7f11-4f8c-87da-9404250ffe12-kube-api-access-ltxxl\") pod \"redhat-marketplace-9qphl\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.144324 4869 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.369752 4869 generic.go:334] "Generic (PLEG): container finished" podID="2f378a11-8a98-4474-b327-a976632b5157" containerID="3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454" exitCode=0 Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.370184 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpg2n" event={"ID":"2f378a11-8a98-4474-b327-a976632b5157","Type":"ContainerDied","Data":"3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454"} Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.377331 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555532-p9jzs" event={"ID":"2d608096-2a3a-407f-bf7e-ed6cb4d1c935","Type":"ContainerDied","Data":"6cbf3329d332ff0db52e63f1fa9e972227a2fb129075da0c326e6ee6ab6edd85"} Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.377371 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbf3329d332ff0db52e63f1fa9e972227a2fb129075da0c326e6ee6ab6edd85" 
Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.377442 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555532-p9jzs" Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.636638 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qphl"] Mar 12 16:12:05 crc kubenswrapper[4869]: W0312 16:12:05.646631 4869 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7faea1a5_7f11_4f8c_87da_9404250ffe12.slice/crio-a6b5a764d0b49d37303416ce3e3865450b120985ca936e81cd9266bb52833ef3 WatchSource:0}: Error finding container a6b5a764d0b49d37303416ce3e3865450b120985ca936e81cd9266bb52833ef3: Status 404 returned error can't find the container with id a6b5a764d0b49d37303416ce3e3865450b120985ca936e81cd9266bb52833ef3 Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.896352 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555526-7w9kb"] Mar 12 16:12:05 crc kubenswrapper[4869]: I0312 16:12:05.908368 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555526-7w9kb"] Mar 12 16:12:06 crc kubenswrapper[4869]: I0312 16:12:06.348981 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31c11f7f-699b-4477-85c1-81fed25b5b00" path="/var/lib/kubelet/pods/31c11f7f-699b-4477-85c1-81fed25b5b00/volumes" Mar 12 16:12:06 crc kubenswrapper[4869]: I0312 16:12:06.388568 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qphl" event={"ID":"7faea1a5-7f11-4f8c-87da-9404250ffe12","Type":"ContainerStarted","Data":"a6b5a764d0b49d37303416ce3e3865450b120985ca936e81cd9266bb52833ef3"} Mar 12 16:12:07 crc kubenswrapper[4869]: I0312 16:12:07.399001 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fpg2n" event={"ID":"2f378a11-8a98-4474-b327-a976632b5157","Type":"ContainerStarted","Data":"50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af"} Mar 12 16:12:07 crc kubenswrapper[4869]: I0312 16:12:07.400575 4869 generic.go:334] "Generic (PLEG): container finished" podID="7faea1a5-7f11-4f8c-87da-9404250ffe12" containerID="27ce283cb7a4578876cd17ca76260a33aeb09a54a23948d864c8bd0ab201705f" exitCode=0 Mar 12 16:12:07 crc kubenswrapper[4869]: I0312 16:12:07.400600 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qphl" event={"ID":"7faea1a5-7f11-4f8c-87da-9404250ffe12","Type":"ContainerDied","Data":"27ce283cb7a4578876cd17ca76260a33aeb09a54a23948d864c8bd0ab201705f"} Mar 12 16:12:08 crc kubenswrapper[4869]: I0312 16:12:08.414577 4869 generic.go:334] "Generic (PLEG): container finished" podID="2f378a11-8a98-4474-b327-a976632b5157" containerID="50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af" exitCode=0 Mar 12 16:12:08 crc kubenswrapper[4869]: I0312 16:12:08.414680 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpg2n" event={"ID":"2f378a11-8a98-4474-b327-a976632b5157","Type":"ContainerDied","Data":"50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af"} Mar 12 16:12:09 crc kubenswrapper[4869]: I0312 16:12:09.425109 4869 generic.go:334] "Generic (PLEG): container finished" podID="7faea1a5-7f11-4f8c-87da-9404250ffe12" containerID="b67068e00bfd09f69177cb4bdecde82e4e07a7e3d3117c16c1c886c9528d5ad7" exitCode=0 Mar 12 16:12:09 crc kubenswrapper[4869]: I0312 16:12:09.425164 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qphl" event={"ID":"7faea1a5-7f11-4f8c-87da-9404250ffe12","Type":"ContainerDied","Data":"b67068e00bfd09f69177cb4bdecde82e4e07a7e3d3117c16c1c886c9528d5ad7"} Mar 12 16:12:09 crc kubenswrapper[4869]: I0312 16:12:09.430915 
4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpg2n" event={"ID":"2f378a11-8a98-4474-b327-a976632b5157","Type":"ContainerStarted","Data":"9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475"} Mar 12 16:12:09 crc kubenswrapper[4869]: I0312 16:12:09.479417 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fpg2n" podStartSLOduration=4.006895366 podStartE2EDuration="7.47939886s" podCreationTimestamp="2026-03-12 16:12:02 +0000 UTC" firstStartedPulling="2026-03-12 16:12:05.372054697 +0000 UTC m=+5077.657279975" lastFinishedPulling="2026-03-12 16:12:08.844558191 +0000 UTC m=+5081.129783469" observedRunningTime="2026-03-12 16:12:09.470661924 +0000 UTC m=+5081.755887202" watchObservedRunningTime="2026-03-12 16:12:09.47939886 +0000 UTC m=+5081.764624138" Mar 12 16:12:10 crc kubenswrapper[4869]: I0312 16:12:10.446269 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qphl" event={"ID":"7faea1a5-7f11-4f8c-87da-9404250ffe12","Type":"ContainerStarted","Data":"8bee68a67bf611f8310d89f139115560cbdad272aeb19c49f5a147ac139b3a99"} Mar 12 16:12:10 crc kubenswrapper[4869]: I0312 16:12:10.466814 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9qphl" podStartSLOduration=4.044710215 podStartE2EDuration="6.466785536s" podCreationTimestamp="2026-03-12 16:12:04 +0000 UTC" firstStartedPulling="2026-03-12 16:12:07.403666952 +0000 UTC m=+5079.688892230" lastFinishedPulling="2026-03-12 16:12:09.825742263 +0000 UTC m=+5082.110967551" observedRunningTime="2026-03-12 16:12:10.463861164 +0000 UTC m=+5082.749086452" watchObservedRunningTime="2026-03-12 16:12:10.466785536 +0000 UTC m=+5082.752010814" Mar 12 16:12:13 crc kubenswrapper[4869]: I0312 16:12:13.364655 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:13 crc kubenswrapper[4869]: I0312 16:12:13.364953 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:13 crc kubenswrapper[4869]: I0312 16:12:13.412056 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.144509 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.144916 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.196477 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.529175 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.819307 4869 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c6vfw"] Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.821818 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.850029 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6vfw"] Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.931312 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqr6h\" (UniqueName: \"kubernetes.io/projected/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-kube-api-access-qqr6h\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.931513 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-catalog-content\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:15 crc kubenswrapper[4869]: I0312 16:12:15.931652 4869 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-utilities\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.036023 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqr6h\" (UniqueName: \"kubernetes.io/projected/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-kube-api-access-qqr6h\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.036088 4869 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-catalog-content\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.036153 4869 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-utilities\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.036755 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-catalog-content\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.036808 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-utilities\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.068655 4869 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqr6h\" (UniqueName: \"kubernetes.io/projected/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-kube-api-access-qqr6h\") pod \"redhat-operators-c6vfw\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.198414 4869 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:16 crc kubenswrapper[4869]: I0312 16:12:16.683413 4869 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6vfw"] Mar 12 16:12:17 crc kubenswrapper[4869]: I0312 16:12:17.514474 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerStarted","Data":"bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97"} Mar 12 16:12:17 crc kubenswrapper[4869]: I0312 16:12:17.514839 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerStarted","Data":"e20bae7d2006f7112d78272432c462421c3a4f4fa30192870037257dc1920210"} Mar 12 16:12:17 crc kubenswrapper[4869]: I0312 16:12:17.607561 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qphl"] Mar 12 16:12:17 crc kubenswrapper[4869]: I0312 16:12:17.607785 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9qphl" podUID="7faea1a5-7f11-4f8c-87da-9404250ffe12" containerName="registry-server" containerID="cri-o://8bee68a67bf611f8310d89f139115560cbdad272aeb19c49f5a147ac139b3a99" gracePeriod=2 Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.536673 4869 generic.go:334] "Generic (PLEG): container finished" podID="7faea1a5-7f11-4f8c-87da-9404250ffe12" containerID="8bee68a67bf611f8310d89f139115560cbdad272aeb19c49f5a147ac139b3a99" exitCode=0 Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.537096 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qphl" 
event={"ID":"7faea1a5-7f11-4f8c-87da-9404250ffe12","Type":"ContainerDied","Data":"8bee68a67bf611f8310d89f139115560cbdad272aeb19c49f5a147ac139b3a99"} Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.537187 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qphl" event={"ID":"7faea1a5-7f11-4f8c-87da-9404250ffe12","Type":"ContainerDied","Data":"a6b5a764d0b49d37303416ce3e3865450b120985ca936e81cd9266bb52833ef3"} Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.537203 4869 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6b5a764d0b49d37303416ce3e3865450b120985ca936e81cd9266bb52833ef3" Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.540096 4869 generic.go:334] "Generic (PLEG): container finished" podID="6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" containerID="bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97" exitCode=0 Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.540267 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerDied","Data":"bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97"} Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.570807 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.681215 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-catalog-content\") pod \"7faea1a5-7f11-4f8c-87da-9404250ffe12\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.681265 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-utilities\") pod \"7faea1a5-7f11-4f8c-87da-9404250ffe12\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.681465 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltxxl\" (UniqueName: \"kubernetes.io/projected/7faea1a5-7f11-4f8c-87da-9404250ffe12-kube-api-access-ltxxl\") pod \"7faea1a5-7f11-4f8c-87da-9404250ffe12\" (UID: \"7faea1a5-7f11-4f8c-87da-9404250ffe12\") " Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.683401 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-utilities" (OuterVolumeSpecName: "utilities") pod "7faea1a5-7f11-4f8c-87da-9404250ffe12" (UID: "7faea1a5-7f11-4f8c-87da-9404250ffe12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.696495 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7faea1a5-7f11-4f8c-87da-9404250ffe12-kube-api-access-ltxxl" (OuterVolumeSpecName: "kube-api-access-ltxxl") pod "7faea1a5-7f11-4f8c-87da-9404250ffe12" (UID: "7faea1a5-7f11-4f8c-87da-9404250ffe12"). InnerVolumeSpecName "kube-api-access-ltxxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.715871 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7faea1a5-7f11-4f8c-87da-9404250ffe12" (UID: "7faea1a5-7f11-4f8c-87da-9404250ffe12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.783299 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.783375 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7faea1a5-7f11-4f8c-87da-9404250ffe12-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:18 crc kubenswrapper[4869]: I0312 16:12:18.783387 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltxxl\" (UniqueName: \"kubernetes.io/projected/7faea1a5-7f11-4f8c-87da-9404250ffe12-kube-api-access-ltxxl\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.549765 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qphl" Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.596013 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qphl"] Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.605506 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qphl"] Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.683930 4869 patch_prober.go:28] interesting pod/machine-config-daemon-2lgzz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.683990 4869 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.684032 4869 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.684893 4869 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6776e811c46f14d863ba424139ec1761f141f4c49b869f301495cffe675c4a4d"} pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 16:12:19 crc kubenswrapper[4869]: I0312 16:12:19.684954 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" podUID="1621c994-94d2-4105-a988-f4739518ba91" containerName="machine-config-daemon" containerID="cri-o://6776e811c46f14d863ba424139ec1761f141f4c49b869f301495cffe675c4a4d" gracePeriod=600 Mar 12 16:12:20 crc kubenswrapper[4869]: I0312 16:12:20.348478 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7faea1a5-7f11-4f8c-87da-9404250ffe12" path="/var/lib/kubelet/pods/7faea1a5-7f11-4f8c-87da-9404250ffe12/volumes" Mar 12 16:12:20 crc kubenswrapper[4869]: I0312 16:12:20.559073 4869 generic.go:334] "Generic (PLEG): container finished" podID="1621c994-94d2-4105-a988-f4739518ba91" containerID="6776e811c46f14d863ba424139ec1761f141f4c49b869f301495cffe675c4a4d" exitCode=0 Mar 12 16:12:20 crc kubenswrapper[4869]: I0312 16:12:20.559154 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerDied","Data":"6776e811c46f14d863ba424139ec1761f141f4c49b869f301495cffe675c4a4d"} Mar 12 16:12:20 crc kubenswrapper[4869]: I0312 16:12:20.559202 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2lgzz" event={"ID":"1621c994-94d2-4105-a988-f4739518ba91","Type":"ContainerStarted","Data":"c995fca4cccd44b2416e60304ba97fc3fd159586d03361c883c1849ab2024a42"} Mar 12 16:12:20 crc kubenswrapper[4869]: I0312 16:12:20.559220 4869 scope.go:117] "RemoveContainer" containerID="7f386f4ac9366e91219ce91ba4e94753e5c56573855dfb2b2ddd058f1eec367e" Mar 12 16:12:20 crc kubenswrapper[4869]: I0312 16:12:20.563124 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerStarted","Data":"cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd"} Mar 12 16:12:21 crc kubenswrapper[4869]: I0312 16:12:21.576576 
4869 generic.go:334] "Generic (PLEG): container finished" podID="6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" containerID="cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd" exitCode=0 Mar 12 16:12:21 crc kubenswrapper[4869]: I0312 16:12:21.576672 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerDied","Data":"cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd"} Mar 12 16:12:22 crc kubenswrapper[4869]: I0312 16:12:22.598366 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerStarted","Data":"ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1"} Mar 12 16:12:23 crc kubenswrapper[4869]: I0312 16:12:23.410056 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:23 crc kubenswrapper[4869]: I0312 16:12:23.433688 4869 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c6vfw" podStartSLOduration=5.03569847 podStartE2EDuration="8.433670655s" podCreationTimestamp="2026-03-12 16:12:15 +0000 UTC" firstStartedPulling="2026-03-12 16:12:18.54365021 +0000 UTC m=+5090.828875488" lastFinishedPulling="2026-03-12 16:12:21.941622395 +0000 UTC m=+5094.226847673" observedRunningTime="2026-03-12 16:12:22.626076191 +0000 UTC m=+5094.911301469" watchObservedRunningTime="2026-03-12 16:12:23.433670655 +0000 UTC m=+5095.718895933" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.011187 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fpg2n"] Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.011760 4869 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-fpg2n" podUID="2f378a11-8a98-4474-b327-a976632b5157" containerName="registry-server" containerID="cri-o://9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475" gracePeriod=2 Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.441883 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.614628 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-catalog-content\") pod \"2f378a11-8a98-4474-b327-a976632b5157\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.614825 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bbf7\" (UniqueName: \"kubernetes.io/projected/2f378a11-8a98-4474-b327-a976632b5157-kube-api-access-8bbf7\") pod \"2f378a11-8a98-4474-b327-a976632b5157\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.614980 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-utilities\") pod \"2f378a11-8a98-4474-b327-a976632b5157\" (UID: \"2f378a11-8a98-4474-b327-a976632b5157\") " Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.616091 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-utilities" (OuterVolumeSpecName: "utilities") pod "2f378a11-8a98-4474-b327-a976632b5157" (UID: "2f378a11-8a98-4474-b327-a976632b5157"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.623566 4869 generic.go:334] "Generic (PLEG): container finished" podID="2f378a11-8a98-4474-b327-a976632b5157" containerID="9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475" exitCode=0 Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.623612 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpg2n" event={"ID":"2f378a11-8a98-4474-b327-a976632b5157","Type":"ContainerDied","Data":"9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475"} Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.623638 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpg2n" event={"ID":"2f378a11-8a98-4474-b327-a976632b5157","Type":"ContainerDied","Data":"605711be491b13103893b57db2b3eda563c55e69f250428484495b0c62146ef8"} Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.623654 4869 scope.go:117] "RemoveContainer" containerID="9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.623789 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fpg2n" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.627739 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f378a11-8a98-4474-b327-a976632b5157-kube-api-access-8bbf7" (OuterVolumeSpecName: "kube-api-access-8bbf7") pod "2f378a11-8a98-4474-b327-a976632b5157" (UID: "2f378a11-8a98-4474-b327-a976632b5157"). InnerVolumeSpecName "kube-api-access-8bbf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.666975 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f378a11-8a98-4474-b327-a976632b5157" (UID: "2f378a11-8a98-4474-b327-a976632b5157"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.690127 4869 scope.go:117] "RemoveContainer" containerID="50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.715404 4869 scope.go:117] "RemoveContainer" containerID="3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.716794 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.716823 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f378a11-8a98-4474-b327-a976632b5157-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.716833 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bbf7\" (UniqueName: \"kubernetes.io/projected/2f378a11-8a98-4474-b327-a976632b5157-kube-api-access-8bbf7\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.768272 4869 scope.go:117] "RemoveContainer" containerID="9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475" Mar 12 16:12:25 crc kubenswrapper[4869]: E0312 16:12:25.768850 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475\": container with ID starting with 9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475 not found: ID does not exist" containerID="9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.768897 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475"} err="failed to get container status \"9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475\": rpc error: code = NotFound desc = could not find container \"9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475\": container with ID starting with 9e20d617a4f1b9e00b82ca4f897b058161d663c01552c4723d2075d80abc2475 not found: ID does not exist" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.768928 4869 scope.go:117] "RemoveContainer" containerID="50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af" Mar 12 16:12:25 crc kubenswrapper[4869]: E0312 16:12:25.769401 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af\": container with ID starting with 50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af not found: ID does not exist" containerID="50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.769427 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af"} err="failed to get container status \"50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af\": rpc error: code = NotFound desc = could not find container 
\"50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af\": container with ID starting with 50e48367d6f52ab47273864488b8b40ff8493ad802e6974167b8c3a1052b69af not found: ID does not exist" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.769444 4869 scope.go:117] "RemoveContainer" containerID="3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454" Mar 12 16:12:25 crc kubenswrapper[4869]: E0312 16:12:25.769917 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454\": container with ID starting with 3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454 not found: ID does not exist" containerID="3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.769941 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454"} err="failed to get container status \"3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454\": rpc error: code = NotFound desc = could not find container \"3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454\": container with ID starting with 3ab2de78b57efb151ff6ff023f2fa816f3c13f06caa14439097b15619df8d454 not found: ID does not exist" Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.959026 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fpg2n"] Mar 12 16:12:25 crc kubenswrapper[4869]: I0312 16:12:25.970158 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fpg2n"] Mar 12 16:12:26 crc kubenswrapper[4869]: I0312 16:12:26.199013 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:26 crc 
kubenswrapper[4869]: I0312 16:12:26.199373 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:26 crc kubenswrapper[4869]: I0312 16:12:26.346390 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f378a11-8a98-4474-b327-a976632b5157" path="/var/lib/kubelet/pods/2f378a11-8a98-4474-b327-a976632b5157/volumes" Mar 12 16:12:27 crc kubenswrapper[4869]: I0312 16:12:27.261785 4869 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c6vfw" podUID="6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" containerName="registry-server" probeResult="failure" output=< Mar 12 16:12:27 crc kubenswrapper[4869]: timeout: failed to connect service ":50051" within 1s Mar 12 16:12:27 crc kubenswrapper[4869]: > Mar 12 16:12:36 crc kubenswrapper[4869]: I0312 16:12:36.245955 4869 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:36 crc kubenswrapper[4869]: I0312 16:12:36.293204 4869 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:36 crc kubenswrapper[4869]: I0312 16:12:36.479754 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c6vfw"] Mar 12 16:12:37 crc kubenswrapper[4869]: I0312 16:12:37.730511 4869 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c6vfw" podUID="6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" containerName="registry-server" containerID="cri-o://ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1" gracePeriod=2 Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.240778 4869 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.369478 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-catalog-content\") pod \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.369678 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-utilities\") pod \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.369705 4869 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqr6h\" (UniqueName: \"kubernetes.io/projected/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-kube-api-access-qqr6h\") pod \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\" (UID: \"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64\") " Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.370568 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-utilities" (OuterVolumeSpecName: "utilities") pod "6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" (UID: "6ee26dd1-be1a-4c6f-86b8-cb81062f0f64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.376032 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-kube-api-access-qqr6h" (OuterVolumeSpecName: "kube-api-access-qqr6h") pod "6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" (UID: "6ee26dd1-be1a-4c6f-86b8-cb81062f0f64"). InnerVolumeSpecName "kube-api-access-qqr6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.472231 4869 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.472260 4869 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqr6h\" (UniqueName: \"kubernetes.io/projected/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-kube-api-access-qqr6h\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.507689 4869 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" (UID: "6ee26dd1-be1a-4c6f-86b8-cb81062f0f64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.574191 4869 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.742759 4869 generic.go:334] "Generic (PLEG): container finished" podID="6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" containerID="ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1" exitCode=0 Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.742801 4869 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerDied","Data":"ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1"} Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.742873 4869 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-c6vfw" event={"ID":"6ee26dd1-be1a-4c6f-86b8-cb81062f0f64","Type":"ContainerDied","Data":"e20bae7d2006f7112d78272432c462421c3a4f4fa30192870037257dc1920210"} Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.742905 4869 scope.go:117] "RemoveContainer" containerID="ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.744163 4869 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vfw" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.765339 4869 scope.go:117] "RemoveContainer" containerID="cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.793710 4869 scope.go:117] "RemoveContainer" containerID="bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.793953 4869 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c6vfw"] Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.806602 4869 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c6vfw"] Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.848870 4869 scope.go:117] "RemoveContainer" containerID="ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1" Mar 12 16:12:38 crc kubenswrapper[4869]: E0312 16:12:38.849489 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1\": container with ID starting with ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1 not found: ID does not exist" containerID="ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.849531 4869 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1"} err="failed to get container status \"ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1\": rpc error: code = NotFound desc = could not find container \"ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1\": container with ID starting with ac430bcfa61a45cd0ebd1f27d3df88510be38d1822c8c50ab680d04d83ebc0d1 not found: ID does not exist" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.849570 4869 scope.go:117] "RemoveContainer" containerID="cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd" Mar 12 16:12:38 crc kubenswrapper[4869]: E0312 16:12:38.851724 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd\": container with ID starting with cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd not found: ID does not exist" containerID="cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.851763 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd"} err="failed to get container status \"cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd\": rpc error: code = NotFound desc = could not find container \"cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd\": container with ID starting with cac6a491b6b417626472ca7d1d7730b0eeee2b644bc0a1f00a333de1989b50dd not found: ID does not exist" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.851792 4869 scope.go:117] "RemoveContainer" containerID="bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97" Mar 12 16:12:38 crc kubenswrapper[4869]: E0312 
16:12:38.852071 4869 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97\": container with ID starting with bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97 not found: ID does not exist" containerID="bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97" Mar 12 16:12:38 crc kubenswrapper[4869]: I0312 16:12:38.852100 4869 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97"} err="failed to get container status \"bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97\": rpc error: code = NotFound desc = could not find container \"bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97\": container with ID starting with bc03348482ba80248f7e4e4b210bd287349ba6285b81eef3556d7cf473ba5c97 not found: ID does not exist" Mar 12 16:12:40 crc kubenswrapper[4869]: I0312 16:12:40.361446 4869 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee26dd1-be1a-4c6f-86b8-cb81062f0f64" path="/var/lib/kubelet/pods/6ee26dd1-be1a-4c6f-86b8-cb81062f0f64/volumes" Mar 12 16:12:47 crc kubenswrapper[4869]: I0312 16:12:47.977179 4869 scope.go:117] "RemoveContainer" containerID="e7581b3c6022fd109d5803bc1c5c2afbc5a54dde6d875d3e085d432e1bfde613"